Write Python code to create a table in MySQL, add data to it, and display the table.
Module 5: Using Databases with Python
• File-based systems
• Database systems
• Relational databases
• Structured Query Language
Reference: Database Systems: Design, Implementation and Management, 12th edition by Carlos
Coronel and Steven Morris (Cengage Learning, ISBN: 978-1-305-62748-2)
History: traditional file-based systems
• Computerized manual filing systems
• Individual files for each purpose
• Each business unit had its own file system
• Organized to facilitate the expected use of the data
• Data processing specialist required to retrieve data and run reports
Example: file-based systems
• Suppose, as an FAU student, that you need to do the following things:
1. Register for courses
2. Pay tuition
3. Work part-time on campus
File Based Systems Illustration
[Diagram: three separate application systems, each with its own file definitions and file-handling routines]
• Tuition payment application programs: tuition payment entry and reports, using TuitionRec (ID, name, address, amtPaid, balance, …) stored in payment file(s)
• Course registration application programs: course registration entry and reports, using RegRec (ID, name, address, courses, …) stored in registration file(s)
• Work-study application programs: work-study data entry and reports, using EmpRec (ID, name, address, wage, …) stored in work-study file(s)
Issues with file-based systems
• Data redundancy
• Different files contain same information (ID, name, address, etc…)
• Isolation of data in separate systems
• Data inconsistency (e.g., different ID values for the same person)
• Lack of data integrity
• Data anomalies
• Not all changes may be applied successfully
• Update / Insertion / Deletion anomalies
Better approach: database systems
• Database: a collection of interrelated data
organized in such a way that it corresponds to the
needs and structure of an organization and can be
used by more than one person for more than one
application.
Example: database system
[Diagram: the three application programs (tuition payment entry and reports, course registration entry and reports, work-study data entry and reports) all access a single shared database through the DBMS]
Advantages of database systems
• Minimal data redundancy
• Data consistency
• Integration of data
• Improved data sharing
• Enforcement of standards
• Ease of application development
• Uniform security, privacy, and integrity
• Data independence from applications
• “self-describing” data stored in a data dictionary (metadata)
File-based systems often imply “flat” data files:
RecNo Name Address City State Zip Product Units Amount
1 John Smith 221 Main St. New York NY 08842 Television 1 $500
2 William Chin 43 1st Ave. Redmond WA 98332 Refrigerator 1 $800
3 William Chin 43 1st Ave. Redmond WA 98332 Toaster 1 $80
4 Marta Dieci 2 West Ave. Reno NV 92342 Television 1 $500
5 Marta Dieci 2 West Ave. Reno NV 92342 Radio 1 $40
6 Marta Dieci 2 West Ave. Reno NV 92342 Stereo 1 $200
7 Peter Melinkoff 53 NE Rodeo Miami FL 18332 Computer 1 $1500
8 Martin Sengali 1234 5th St. Boston FL 03423 Television 1 $500
9 Martin Sengali 1234 5th St. Boston FL 03423 Stereo 1 $200
10 Martin Sengali 1234 5th St. Boston FL 03423 Radio 1 $40
11 Martin Sengali 1234 5th St. Boston FL 03423 Refrigerator 1 $80
Subset of the problems with a file-based system:
• Redundancy (i.e., data duplication)
• Update anomalies
• Updating a single address requires updating multiple entries
• Insertion anomalies
• You cannot insert information about a customer until they have actually purchased something
• Deletion anomalies
• If only a single instance of a given product was ever sold and we remove the customer who purchased it, we also lose the information about that product
Improving data management: Types of database models
• Hierarchical database model
• Tree-based approach developed in the 1960s
• Based on parent-child relationships (1:M)
• Network database model
• Created to improve on hierarchical model
• Allows records to have more than one parent
• Can access data from multiple points
• Relational database model
Relational database model
Turing Award Winner, Edgar F. Codd’s landmark paper, “A Relational Model of
Data for Large Shared Data Banks” (1970) laid out a new way to organize and
access data: the Relational Model.
Customer(CustomerID, Name, …)
Order(OrderID, CustomerID, OrderDate, …)
ItemsOrdered(OrderID, ItemID, Quantity, …)
Items(ItemID, Description, Price, …)
Flat-file database
RecNo Name Address City State Zip Product Units Amount
1 John Smith 221 Main St. New York NY 08842 Television 1 $500
2 William Chin 43 1st Ave. Redmond WA 98332 Refrigerator 1 $800
3 William Chin 43 1st Ave. Redmond WA 98332 Toaster 1 $80
4 Marta Dieci 2 West Ave. Reno NV 92342 Television 1 $500
5 Marta Dieci 2 West Ave. Reno NV 92342 Radio 1 $40
6 Marta Dieci 2 West Ave. Reno NV 92342 Stereo 1 $200
7 Peter Melinkoff 53 NE Rodeo Miami FL 18332 Computer 1 $1500
8 Martin Sengali 1234 5th St. Boston FL 03423 Television 1 $500
9 Martin Sengali 1234 5th St. Boston FL 03423 Stereo 1 $200
10 Martin Sengali 1234 5th St. Boston FL 03423 Radio 1 $40
11 Martin Sengali 1234 5th St. Boston FL 03423 Refrigerator 1 $80
Example: relational database
Customer table
CusNo Name Address City State Zip
1 John Smith 221 Main St. New York NY 08842
2 William Chin 43 First Ave. Redmond WA 98332
3 Marta Dieci 2 West Ave. Reno NV 92342
4 Peter Melinkoff 53 NE Rodeo Miami FL 18332
5 Martin Sengali 1234 5th St. Boston MA 03423
OrderItem table
CusNo Product Units Amount
1 Television 1 $500
2 Refrigerator 1 $800
2 Toaster 1 $80
3 Television 1 $500
3 Radio 1 $40
3 Stereo 1 $200
4 Computer 1 $1500
5 Television 1 $500
5 Stereo 1 $200
5 Radio 1 $40
5 Refrigerator 1 $800
Another database in relational form
BOOK table
ISBN Title PubID Price
0-103-45678-9 Iliad 1 $25.00
0-11-345678-9 Moby Dick 3 $49.00
0-12-333433-3 On Liberty 1 $25.00
0-12-345678-9 Jane Eyre 3 $49.00
0-123-45678-0 Ulysses 2 $34.00
0-321-32132-1 Balloon 3 $34.00
0-55-123456-9 Main Street 3 $22.95
0-555-55555-9 MacBeth 2 $12.00
0-91-045678-5 Hamlet 2 $20.00
0-91-335678-7 Fairie Queene 1 $15.00
0-99-777777-7 King Lear 2 $49.00
0-99-999999-9 Emma 1 $20.00
1-1111-1111-1 C++ 1 $29.95
1-22-233700-0 Visual Basic 1 $25.00
AUTHOR table
AuID AuName AuPhone
1 Austen 111-111-1111
10 Jones 123-333-3333
11 Snoopy 321-321-2222
12 Grumpy 321-321-0000
13 Sleepy 321-321-1111
2 Melville 222-222-2222
3 Homer 333-333-3333
4 Roman 444-444-4444
5 Shakespeare 555-555-5555
6 Joyce 666-666-6666
7 Spencer 777-777-7777
8 Mill 888-888-8888
9 Smith 123-222-2222
BOOK/AUTHOR table
ISBN AuID
0-103-45678-9 3
0-11-345678-9 2
0-12-333433-3 8
0-12-345678-9 1
0-123-45678-0 6
0-321-32132-1 11
0-321-32132-1 12
0-321-32132-1 13
0-55-123456-9 9
0-55-123456-9 10
0-555-55555-9 5
0-91-045678-5 5
0-91-335678-7 7
0-99-777777-7 5
0-99-999999-9 1
1-1111-1111-1 4
1-22-233700-0 4
PUBLISHER table
PubID PubName PubPhone
1 Big House 123-456-7890
2 Alpha Press 999-999-9999
3 Small House 714-000-0000
Install MySQL Server and Workbench
Use the video: How To Install MySQL (Server and Workbench) [05:57]
Please note: as you install MySQL, when you select Products and Features (see [02:30] in the video), click on MySQL Connectors and add the Python connector as well.
If you cannot add the Python connector, I will teach you another method to connect Python and MySQL later.
Manipulating data in databases
Structured Query Language (SQL)
• creating database and table structures
• performing data manipulation and administration
• querying the database to extract useful information
Categories of SQL commands
• Data Definition Language (DDL)
– Commands that define a database, including creating, altering, and
dropping tables and stored procedures, and establishing constraints
• CREATE TABLE, set PRIMARY KEY
• Data Manipulation Language (DML)
– Commands that are used to manipulate data and extract information
• SELECT, UPDATE, INSERT, DELETE (examples of both categories appear below)
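For illustration, a minimal pairing of the two categories; the dept_copy table is hypothetical and used only in this sketch:

-- DDL: create a table and establish its primary key
CREATE TABLE dept_copy (
  Dept_Code VARCHAR(3) NOT NULL,
  Dept_Name VARCHAR(20),
  PRIMARY KEY (Dept_Code)
);

-- DML: insert, query, update, and delete rows
INSERT INTO dept_copy VALUES ('Sal', 'Sales');
SELECT * FROM dept_copy;
UPDATE dept_copy SET Dept_Name = 'Sales & Marketing' WHERE Dept_Code = 'Sal';
DELETE FROM dept_copy WHERE Dept_Code = 'Sal';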
Data types
• ANSI/ISO SQL data types:
– INTEGER / SMALLINT
– DECIMAL(precision, scale)
– CHAR(n) – fixed length character data
– VARCHAR(n) – variable length character data
– DATE – Julian date format
– plus several more…
• Other DBMSs add additional data types.
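As a sketch of how these types appear in a table definition (the product table and its columns are hypothetical):

CREATE TABLE product (
  prod_id  INTEGER NOT NULL,
  qty      SMALLINT,
  price    DECIMAL(19,4),
  code     CHAR(3),
  descr    VARCHAR(20),
  added_on DATE,
  PRIMARY KEY (prod_id)
);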
MySQL data types (cont.)
Primary date and time types:
• DATE ‘YYYY-MM-DD’ format
range: ‘1000-01-01’ to ‘9999-12-31’
• DATETIME ‘YYYY-MM-DD HH:MM:SS’ format
range: ‘… 00:00:00’ to ‘… 23:59:59’
Invalid dates and times are converted to zero values: ‘0000-00-00’
Some built-in functions (see the sample query below):
NOW( ), CURDATE( ), DATEDIFF( ), INTERVAL
DATE( ), TIME( ), DAY( ), YEAR( ), MONTH( ), etc.
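A quick self-contained query showing a few of these (MySQL allows SELECT without a FROM clause):

SELECT NOW(), CURDATE(),
       DATEDIFF(CURDATE(), '2018-01-01') AS days_since,
       YEAR(NOW()) AS this_year;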
Sample Database
Posted on course site:
• class.sql (MySQL 5.0)
• We will be running examples of queries on this database as
we cover the material, to help illustrate how the different
SQL statements work.
• I strongly recommend working through the examples on your
own as we go, in order to better understand them.
Sample Database (continued)
6 Tables: Department, Employee, Item, Lunch, Lunch_item, Supplier
GROUP BY
Ex: For each Dept_Code, list the number of employees (as count) and the total credit limit (as total). One possible query appears below.
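One way to write this against the Employee table of the class database (defined in class.sql later in this dump):

SELECT Dept_Code,
       COUNT(*) AS count,
       SUM(Emp_CreditLimit) AS total
FROM class.employee
GROUP BY Dept_Code;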
{
“cells”: [
{
“cell_type”: “markdown”,
“metadata”: {},
“source”: [
“# Python Basics (Instructor: Dr. Milad Baghersad)\n”,
“## Module 5: Using Databases with Python”
]
},
{
“cell_type”: “markdown”,
“metadata”: {},
“source”: [
“___\n”,
“___\n”,
“___\n”,
“___\n”,
“### Accessing a Database from Python”
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: [
“import mysql\n”,
“import mysql.connector”
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: [
“!pip install mysql-connector-python”
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: [
“!pip install mysql”
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: [
“import mysql\n”,
“import mysql.connector”
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: [
“database = mysql.connector.connect(host=\”localhost\”,user=\”root\”,password=\”12345\”, database=\”class\”)\n”,
“cursor = database.cursor()”
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: [
“query = \”select * from class.employee\””
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: [
“cursor.execute(query)”
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: [
“employee_data = cursor.fetchall()”
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: [
“employee_data”
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: [
“import pandas as pd\n”,
“employee_table= pd.DataFrame(employee_data)”
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: [
“employee_table”
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: [
“employee_table = pd.DataFrame(employee_data,columns=(\”Emp_ID\”,\”Emp_FirstName\”, \”Emp_LastName\”,\”Dept_Code\”,\”Emp_HireDate\”,\n”,
” \”Emp_CreditLimit\”,\”Emp_Phone\”,\”Emp_MgrID\”))”
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: [
“employee_table”
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: [
“cursor.close()\n”,
“database.close()”
]
},
{
“cell_type”: “markdown”,
“metadata”: {},
“source”: [
“___\n”,
“___\n”,
“___\n”,
“___\n”,
“### Insert values into SQL Server table using Python”
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: [
“import mysql\n”,
“import mysql.connector\n”,
“database = mysql.connector.connect(host=\”localhost\”,user=\”root\”,password=\”12345\”, database=\”class\”)\n”,
“cursor = database.cursor()”
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: [
“#error!\n”,
“query = \”INSERT INTO class.employee (Emp_ID,Emp_FirstName, Emp_LastName,Dept_Code,Emp_HireDate,Emp_CreditLimit,Emp_Phone,Emp_MgrID) VALUES(11,'Chris',\"Smith\",\"ISM\",\"2018-01-01\",15,4444,22)\””
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: [
“query = \”INSERT INTO class.employee (Emp_ID,Emp_FirstName, Emp_LastName,Dept_Code,Emp_HireDate,Emp_CreditLimit,Emp_Phone,Emp_MgrID) VALUES (11,'Chris','Smith','ISM','2018-01-01',15,4444,22)\””
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: [
“query = \”\”\”\n”,
“INSERT INTO class.employee (Emp_ID,Emp_FirstName, Emp_LastName,Dept_Code,Emp_HireDate,\n”,
“Emp_CreditLimit,Emp_Phone,Emp_MgrID) VALUES (12,'KKKK','Smith','ISM','2018-01-01',15,4444,22)\n”,
“\”\”\””
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: [
“query”
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: [
“cursor.execute(query)”
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: [
“database.commit()”
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: [
“cursor.close()\n”,
“database.close()”
]
},
{
“cell_type”: “markdown”,
“metadata”: {},
“source”: [
“### Example:\n”
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: [
“import mysql\n”,
“import mysql.connector\n”,
“import csv”
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: [
“database = mysql.connector.connect(host=\”localhost\”,user=\”root\”,password=\”12345\”, database=\”class\”)\n”,
“cursor = database.cursor()\n”,
“query = \”\”\”CREATE TABLE class.grades (ID INT NOT NULL AUTO_INCREMENT,firstname VARCHAR(45) NULL,\n”,
“lastname VARCHAR(45) NULL,grade INT(11) NULL,PRIMARY KEY (ID));\”\”\””
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: [
“cursor.execute(query)”
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: [
“database.commit()\n”,
“cursor.close()\n”,
“database.close()”
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: [
“database = mysql.connector.connect(host=\”localhost\”,user=\”root\”,password=\”12345\”, database=\”class\”)\n”,
“cursor = database.cursor()”
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: [
“updatefile = open(\”Students.csv\”,'r')\n”,
“content=updatefile.readlines()”
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: [
“for line in content:\n”,
” print(line)”
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: [
“for line in content:\n”,
” columns = line.split(\”,\”)\n”,
” FN = columns[0]\n”,
” LN = columns[1]\n”,
” GR = columns[2]\n”,
” \n”,
” query= \”INSERT class.grades (firstname, lastname, grade) VALUES (%(1)s, %(2)s, %(3)s)\” %{\”1\”: FN, \”2\”: LN, \”3\”: GR}\n”,
” print(query)”
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: [
“database = mysql.connector.connect(host=\”localhost\”,user=\”root\”,password=\”12345\”, database=\”class\”)\n”,
“cursor = database.cursor()\n”,
“\n”,
“for line in content:\n”,
” columns = line.split(\”,\”)\n”,
” FN = columns[0]\n”,
” LN = columns[1]\n”,
” GR = columns[2]\n”,
” \n”,
” query= \”INSERT class.grades (firstname, lastname, grade) VALUES ('%(1)s', '%(2)s', %(3)s)\” %{\”1\”: FN, \”2\”: LN, \”3\”: GR}\n”,
” #print(query)\n”,
” cursor.execute(query)\n”,
” database.commit() \n”
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: [
“cursor.close()\n”,
“database.close()”
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: []
}
],
“metadata”: {
“kernelspec”: {
“display_name”: “Python 3”,
“language”: “python”,
“name”: “python3”
},
“language_info”: {
“codemirror_mode”: {
“name”: “ipython”,
“version”: 3
},
“file_extension”: “.py”,
“mimetype”: “text/x-python”,
“name”: “python”,
“nbconvert_exporter”: “python”,
“pygments_lexer”: “ipython3”,
“version”: “3.7.3”
}
},
“nbformat”: 4,
“nbformat_minor”: 2
}
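A side note on the notebook above: building INSERT statements with %-string formatting works for clean input, but a quote or comma in the data will break the statement. Below is a minimal sketch of the same CSV-import loop using the connector's parameterized queries instead; the connection settings mirror the notebook's, and Students.csv is assumed to be in the working directory:

import mysql.connector

# placeholder local credentials, as in the notebook above
database = mysql.connector.connect(host="localhost", user="root",
                                   password="12345", database="class")
cursor = database.cursor()

with open("Students.csv", "r") as f:
    for line in f:
        first, last, grade = line.strip().split(",")
        # %s placeholders are filled in by the driver, so special
        # characters in the data cannot corrupt the SQL statement
        cursor.execute(
            "INSERT INTO class.grades (firstname, lastname, grade) "
            "VALUES (%s, %s, %s)",
            (first, last, grade),
        )

database.commit()
cursor.close()
database.close()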
Description: Master Index of EDGAR Dissemination Feed
Last Data Received: March 31, 2018
Comments: webmaster@sec.gov
Anonymous FTP: ftp://ftp.sec.gov/edgar/
Cloud HTTP: https://www.sec.gov/Archives/
CIK|Company Name|Form Type|Date Filed|Filename
--------------------------------------------------------------------------------
1000032|BINCH JAMES G|4|2018-02-16|edgar/data/1000032/0000913165-18-000034.txt
1000045|NICHOLAS FINANCIAL INC|10-Q|2018-02-09|edgar/data/1000045/0001193125-18-037381.txt
1000045|NICHOLAS FINANCIAL INC|4|2018-02-15|edgar/data/1000045/0001000045-18-000004.txt
1000045|NICHOLAS FINANCIAL INC|4|2018-03-08|edgar/data/1000045/0001000045-18-000005.txt
1000045|NICHOLAS FINANCIAL INC|4|2018-03-20|edgar/data/1000045/0001609591-18-000001.txt
1000045|NICHOLAS FINANCIAL INC|8-K|2018-01-09|edgar/data/1000045/0001193125-18-007253.txt
1000045|NICHOLAS FINANCIAL INC|8-K|2018-02-05|edgar/data/1000045/0001193125-18-032199.txt
1000045|NICHOLAS FINANCIAL INC|8-K|2018-02-07|edgar/data/1000045/0001193125-18-034693.txt
1000045|NICHOLAS FINANCIAL INC|8-K|2018-02-20|edgar/data/1000045/0001193125-18-049706.txt
1000045|NICHOLAS FINANCIAL INC|SC 13G/A|2018-02-12|edgar/data/1000045/0001104659-18-008485.txt
1000045|NICHOLAS FINANCIAL INC|SC 13G/A|2018-02-14|edgar/data/1000045/0001037389-18-000160.txt
1000045|NICHOLAS FINANCIAL INC|SC 13G|2018-02-09|edgar/data/1000045/0001258897-18-001316.txt
1000045|NICHOLAS FINANCIAL INC|SC 13G|2018-02-13|edgar/data/1000045/0000315066-18-001444.txt
1000097|KINGDON CAPITAL MANAGEMENT, L.L.C.|13F-HR|2018-02-14|edgar/data/1000097/0000919574-18-001804.txt
1000097|KINGDON CAPITAL MANAGEMENT, L.L.C.|SC 13G/A|2018-01-02|edgar/data/1000097/0000919574-18-000008.txt
1000097|KINGDON CAPITAL MANAGEMENT, L.L.C.|SC 13G/A|2018-02-14|edgar/data/1000097/0000919574-18-001760.txt
1000097|KINGDON CAPITAL MANAGEMENT, L.L.C.|SC 13G/A|2018-02-14|edgar/data/1000097/0000919574-18-001765.txt
1000097|KINGDON CAPITAL MANAGEMENT, L.L.C.|SC 13G/A|2018-02-14|edgar/data/1000097/0000919574-18-001773.txt
1000097|KINGDON CAPITAL MANAGEMENT, L.L.C.|SC 13G/A|2018-02-14|edgar/data/1000097/0000919574-18-001777.txt
1000097|KINGDON CAPITAL MANAGEMENT, L.L.C.|SC 13G/A|2018-02-14|edgar/data/1000097/0000919574-18-001785.txt
1000097|KINGDON CAPITAL MANAGEMENT, L.L.C.|SC 13G|2018-02-14|edgar/data/1000097/0000919574-18-001790.txt
1000177|NORDIC AMERICAN TANKERS Ltd|6-K|2018-02-28|edgar/data/1000177/0000919574-18-002148.txt
1000177|NORDIC AMERICAN TANKERS Ltd|CORRESP|2018-01-19|edgar/data/1000177/0000919574-18-000510.txt
1000177|NORDIC AMERICAN TANKERS Ltd|CORRESP|2018-02-08|edgar/data/1000177/0000919574-18-000929.txt
1000177|NORDIC AMERICAN TANKERS Ltd|CORRESP|2018-02-22|edgar/data/1000177/0000919574-18-001989.txt
1000177|NORDIC AMERICAN TANKERS Ltd|CORRESP|2018-03-02|edgar/data/1000177/0000919574-18-002224.txt
1000177|NORDIC AMERICAN TANKERS Ltd|SC 13G|2018-01-12|edgar/data/1000177/0000895421-18-000006.txt
1000177|NORDIC AMERICAN TANKERS Ltd|UPLOAD|2018-02-05|edgar/data/1000177/0000000000-18-004057.txt
1000177|NORDIC AMERICAN TANKERS Ltd|UPLOAD|2018-03-19|edgar/data/1000177/0000000000-18-008366.txt
1000184|SAP SE|20-F|2018-02-28|edgar/data/1000184/0001104659-18-013050.txt
1000184|SAP SE|6-K|2018-01-30|edgar/data/1000184/0001104659-18-005109.txt
1000184|SAP SE|6-K|2018-01-31|edgar/data/1000184/0001104659-18-005283.txt
1000184|SAP SE|6-K|2018-02-22|edgar/data/1000184/0001104659-18-011197.txt
1000184|SAP SE|6-K|2018-03-01|edgar/data/1000184/0001104659-18-013742.txt
1000184|SAP SE|6-K|2018-03-06|edgar/data/1000184/0001104659-18-015111.txt
1000184|SAP SE|DFAN14A|2018-01-30|edgar/data/1000184/0001104659-18-005123.txt
1000184|SAP SE|DFAN14A|2018-01-30|edgar/data/1000184/0001104659-18-005124.txt
1000184|SAP SE|DFAN14A|2018-01-30|edgar/data/1000184/0001104659-18-005125.txt
1000184|SAP SE|DFAN14A|2018-01-30|edgar/data/1000184/0001104659-18-005126.txt
Mitchell,Burns,100
Bryan,Dreyer,100
Domonique,Fair,100
Erotida,Gjino,100
Mario,Hanna,100
Nina,Hasrouni,100
Kaan,Koprulu,100
Johnathan,MacCurdy,100
Mohammed,Madkhli,100
Matthew,Martinez,100
# Dump File
#
# Database is ported from MS Access
#--------------------------------------------------------
# Program Version 3.0.138
CREATE DATABASE IF NOT EXISTS `class`;
USE `class`;
#
# Table structure for table 'Department'
#
DROP TABLE IF EXISTS `Department`;
CREATE TABLE `Department` (
`Dept_Code` VARCHAR(3) NOT NULL,
`Dept_Name` VARCHAR(20),
INDEX (`Dept_Code`),
PRIMARY KEY (`Dept_Code`)
) ENGINE=myisam DEFAULT CHARSET=utf8;
SET autocommit=1;
#
# Dumping data for table 'Department'
#
INSERT INTO `Department` (`Dept_Code`, `Dept_Name`) VALUES ('Act', 'Accounting');
INSERT INTO `Department` (`Dept_Code`, `Dept_Name`) VALUES ('Exe', 'Executive');
INSERT INTO `Department` (`Dept_Code`, `Dept_Name`) VALUES ('Fin', 'Finance');
INSERT INTO `Department` (`Dept_Code`, `Dept_Name`) VALUES ('Mkt', 'Marketing');
INSERT INTO `Department` (`Dept_Code`, `Dept_Name`) VALUES ('Per', 'Personnel');
INSERT INTO `Department` (`Dept_Code`, `Dept_Name`) VALUES ('Sal', 'Sales');
INSERT INTO `Department` (`Dept_Code`, `Dept_Name`) VALUES ('Shp', 'Shipping');
# 7 records
#
# Table structure for table 'Employee'
#
DROP TABLE IF EXISTS `Employee`;
CREATE TABLE `Employee` (
`Emp_ID` INTEGER NOT NULL,
`Emp_FirstName` VARCHAR(10),
`Emp_LastName` VARCHAR(10),
`Dept_Code` VARCHAR(3),
`Emp_HireDate` DATETIME,
`Emp_CreditLimit` DECIMAL(19,4),
`Emp_Phone` VARCHAR(4),
`Emp_MgrID` INTEGER,
INDEX (`Dept_Code`),
INDEX (`Emp_ID`),
INDEX (`Emp_MgrID`),
PRIMARY KEY (`Emp_ID`)
) ENGINE=myisam DEFAULT CHARSET=utf8;
SET autocommit=1;
#
# Dumping data for table 'Employee'
#
INSERT INTO `Employee` (`Emp_ID`, `Emp_FirstName`, `Emp_LastName`, `Dept_Code`, `Emp_HireDate`, `Emp_CreditLimit`, `Emp_Phone`, `Emp_MgrID`) VALUES (201, 'Susan', 'Brown', 'Exe', '1992-06-01 00:00:00', 30, '3484', NULL);
INSERT INTO `Employee` (`Emp_ID`, `Emp_FirstName`, `Emp_LastName`, `Dept_Code`, `Emp_HireDate`, `Emp_CreditLimit`, `Emp_Phone`, `Emp_MgrID`) VALUES (202, 'Jim', 'Kern', 'Sal', '1995-08-15 00:00:00', 26, '8722', 201);
INSERT INTO `Employee` (`Emp_ID`, `Emp_FirstName`, `Emp_LastName`, `Dept_Code`, `Emp_HireDate`, `Emp_CreditLimit`, `Emp_Phone`, `Emp_MgrID`) VALUES (203, 'Martha', 'Woods', 'Shp', '1997-02-01 00:00:00', 25, '7591', 201);
INSERT INTO `Employee` (`Emp_ID`, `Emp_FirstName`, `Emp_LastName`, `Dept_Code`, `Emp_HireDate`, `Emp_CreditLimit`, `Emp_Phone`, `Emp_MgrID`) VALUES (204, 'Ellen', 'Owens', 'Sal', '1996-07-01 00:00:00', 15, '6830', 202);
INSERT INTO `Employee` (`Emp_ID`, `Emp_FirstName`, `Emp_LastName`, `Dept_Code`, `Emp_HireDate`, `Emp_CreditLimit`, `Emp_Phone`, `Emp_MgrID`) VALUES (205, 'Henry', 'Perkins', 'Sal', '1998-03-01 00:00:00', 25, '5286', 202);
INSERT INTO `Employee` (`Emp_ID`, `Emp_FirstName`, `Emp_LastName`, `Dept_Code`, `Emp_HireDate`, `Emp_CreditLimit`, `Emp_Phone`, `Emp_MgrID`) VALUES (206, 'Carol', 'Rose', 'Act', '1997-10-15 00:00:00', 15, '3829', 201);
INSERT INTO `Employee` (`Emp_ID`, `Emp_FirstName`, `Emp_LastName`, `Dept_Code`, `Emp_HireDate`, `Emp_CreditLimit`, `Emp_Phone`, `Emp_MgrID`) VALUES (207, 'Dan', 'Smith', 'Shp', '1996-12-01 00:00:00', 25, '2259', 203);
INSERT INTO `Employee` (`Emp_ID`, `Emp_FirstName`, `Emp_LastName`, `Dept_Code`, `Emp_HireDate`, `Emp_CreditLimit`, `Emp_Phone`, `Emp_MgrID`) VALUES (208, 'Fred', 'Campbell', 'Shp', '1997-04-01 00:00:00', 25, '1752', 203);
INSERT INTO `Employee` (`Emp_ID`, `Emp_FirstName`, `Emp_LastName`, `Dept_Code`, `Emp_HireDate`, `Emp_CreditLimit`, `Emp_Phone`, `Emp_MgrID`) VALUES (209, 'Paula', 'Jacobs', 'Mkt', '1998-03-17 00:00:00', 15, '3357', 201);
INSERT INTO `Employee` (`Emp_ID`, `Emp_FirstName`, `Emp_LastName`, `Dept_Code`, `Emp_HireDate`, `Emp_CreditLimit`, `Emp_Phone`, `Emp_MgrID`) VALUES (210, 'Nancy', 'Hoffman', 'Sal', '1996-02-15 00:00:00', 25, '2974', 203);
# 10 records
#
# Table structure for table 'Item'
#
DROP TABLE IF EXISTS `Item`;
CREATE TABLE `Item` (
`Item_Number` INTEGER NOT NULL,
`Item_Desc` VARCHAR(20),
`Item_Price` DECIMAL(19,4),
`Item_PriceIncrease` DECIMAL(19,4),
`Supplier_ID` VARCHAR(3),
PRIMARY KEY (`Item_Number`),
INDEX (`Supplier_ID`)
) ENGINE=myisam DEFAULT CHARSET=utf8;
SET autocommit=1;
#
# Dumping data for table 'Item'
#
INSERT INTO `Item` (`Item_Number`, `Item_Desc`, `Item_Price`, `Item_PriceIncrease`, `Supplier_ID`) VALUES (1, 'Fresh Salad', 2, .25, 'Asp');
INSERT INTO `Item` (`Item_Number`, `Item_Desc`, `Item_Price`, `Item_PriceIncrease`, `Supplier_ID`) VALUES (2, 'Soup of the Day', 1.5, NULL, 'Asp');
INSERT INTO `Item` (`Item_Number`, `Item_Desc`, `Item_Price`, `Item_PriceIncrease`, `Supplier_ID`) VALUES (3, 'Sandwich', 3.5, .4, 'Asp');
INSERT INTO `Item` (`Item_Number`, `Item_Desc`, `Item_Price`, `Item_PriceIncrease`, `Supplier_ID`) VALUES (4, 'Grilled steak', 6, .7, 'Cbc');
INSERT INTO `Item` (`Item_Number`, `Item_Desc`, `Item_Price`, `Item_PriceIncrease`, `Supplier_ID`) VALUES (5, 'Hamburger', 2.5, .3, 'Cbc');
INSERT INTO `Item` (`Item_Number`, `Item_Desc`, `Item_Price`, `Item_PriceIncrease`, `Supplier_ID`) VALUES (6, 'Broccoli', 1, .05, 'Frv');
INSERT INTO `Item` (`Item_Number`, `Item_Desc`, `Item_Price`, `Item_PriceIncrease`, `Supplier_ID`) VALUES (7, 'French Fries', 1.5, NULL, 'Frv');
INSERT INTO `Item` (`Item_Number`, `Item_Desc`, `Item_Price`, `Item_PriceIncrease`, `Supplier_ID`) VALUES (8, 'Soda', 1.25, .25, 'Jbr');
INSERT INTO `Item` (`Item_Number`, `Item_Desc`, `Item_Price`, `Item_PriceIncrease`, `Supplier_ID`) VALUES (9, 'Coffee', .85, .15, 'Jbr');
INSERT INTO `Item` (`Item_Number`, `Item_Desc`, `Item_Price`, `Item_PriceIncrease`, `Supplier_ID`) VALUES (10, 'Dessert', 3, .5, 'Vsb');
# 10 records
#
# Table structure for table 'Lunch'
#
DROP TABLE IF EXISTS `Lunch`;
CREATE TABLE `Lunch` (
`Lunch_ID` INTEGER NOT NULL,
`Lunch_Date` DATETIME,
`Emp_ID` INTEGER,
INDEX (`Emp_ID`),
INDEX (`Lunch_ID`),
PRIMARY KEY (`Lunch_ID`)
) ENGINE=myisam DEFAULT CHARSET=utf8;
SET autocommit=1;
#
# Dumping data for table 'Lunch'
#
INSERT INTO `Lunch` (`Lunch_ID`, `Lunch_Date`, `Emp_ID`) VALUES (1, '1998-11-16 00:00:00', 201);
INSERT INTO `Lunch` (`Lunch_ID`, `Lunch_Date`, `Emp_ID`) VALUES (2, '1998-11-16 00:00:00', 202);
INSERT INTO `Lunch` (`Lunch_ID`, `Lunch_Date`, `Emp_ID`) VALUES (3, '1998-11-16 00:00:00', 203);
INSERT INTO `Lunch` (`Lunch_ID`, `Lunch_Date`, `Emp_ID`) VALUES (4, '1998-11-16 00:00:00', 207);
INSERT INTO `Lunch` (`Lunch_ID`, `Lunch_Date`, `Emp_ID`) VALUES (5, '1998-11-16 00:00:00', 206);
INSERT INTO `Lunch` (`Lunch_ID`, `Lunch_Date`, `Emp_ID`) VALUES (6, '1998-11-16 00:00:00', 210);
INSERT INTO `Lunch` (`Lunch_ID`, `Lunch_Date`, `Emp_ID`) VALUES (7, '1998-11-25 00:00:00', 201);
INSERT INTO `Lunch` (`Lunch_ID`, `Lunch_Date`, `Emp_ID`) VALUES (8, '1998-11-25 00:00:00', 205);
INSERT INTO `Lunch` (`Lunch_ID`, `Lunch_Date`, `Emp_ID`) VALUES (9, '1998-11-25 00:00:00', 204);
INSERT INTO `Lunch` (`Lunch_ID`, `Lunch_Date`, `Emp_ID`) VALUES (10, '1998-11-25 00:00:00', 207);
INSERT INTO `Lunch` (`Lunch_ID`, `Lunch_Date`, `Emp_ID`) VALUES (11, '1998-11-25 00:00:00', 208);
INSERT INTO `Lunch` (`Lunch_ID`, `Lunch_Date`, `Emp_ID`) VALUES (12, '1998-12-04 00:00:00', 201);
INSERT INTO `Lunch` (`Lunch_ID`, `Lunch_Date`, `Emp_ID`) VALUES (13, '1998-12-04 00:00:00', 203);
INSERT INTO `Lunch` (`Lunch_ID`, `Lunch_Date`, `Emp_ID`) VALUES (14, '1998-12-04 00:00:00', 205);
INSERT INTO `Lunch` (`Lunch_ID`, `Lunch_Date`, `Emp_ID`) VALUES (15, '1998-12-04 00:00:00', 210);
INSERT INTO `Lunch` (`Lunch_ID`, `Lunch_Date`, `Emp_ID`) VALUES (16, '1998-12-04 00:00:00', 208);
# 16 records
#
# Table structure for table 'Lunch_item'
#
DROP TABLE IF EXISTS `Lunch_item`;
CREATE TABLE `Lunch_item` (
`Lunch_ID` INTEGER NOT NULL,
`Item_Number` INTEGER NOT NULL,
`LI_Quantity` INTEGER,
INDEX (`Lunch_ID`),
PRIMARY KEY (`Lunch_ID`, `Item_Number`)
) ENGINE=myisam DEFAULT CHARSET=utf8;
SET autocommit=1;
#
# Dumping data for table 'Lunch_item'
#
INSERT INTO `Lunch_item` (`Lunch_ID`, `Item_Number`, `LI_Quantity`) VALUES (1, 1, 1);
INSERT INTO `Lunch_item` (`Lunch_ID`, `Item_Number`, `LI_Quantity`) VALUES (1, 3, 2);
INSERT INTO `Lunch_item` (`Lunch_ID`, `Item_Number`, `LI_Quantity`) VALUES (1, 5, 2);
INSERT INTO `Lunch_item` (`Lunch_ID`, `Item_Number`, `LI_Quantity`) VALUES (2, 1, 1);
INSERT INTO `Lunch_item` (`Lunch_ID`, `Item_Number`, `LI_Quantity`) VALUES (2, 3, 2);
INSERT INTO `Lunch_item` (`Lunch_ID`, `Item_Number`, `LI_Quantity`) VALUES (2, 4, 1);
INSERT INTO `Lunch_item` (`Lunch_ID`, `Item_Number`, `LI_Quantity`) VALUES (2, 6, 2);
INSERT INTO `Lunch_item` (`Lunch_ID`, `Item_Number`, `LI_Quantity`) VALUES (3, 1, 1);
INSERT INTO `Lunch_item` (`Lunch_ID`, `Item_Number`, `LI_Quantity`) VALUES (3, 4, 1);
INSERT INTO `Lunch_item` (`Lunch_ID`, `Item_Number`, `LI_Quantity`) VALUES (3, 7, 1);
INSERT INTO `Lunch_item` (`Lunch_ID`, `Item_Number`, `LI_Quantity`) VALUES (3, 8, 1);
INSERT INTO `Lunch_item` (`Lunch_ID`, `Item_Number`, `LI_Quantity`) VALUES (3, 9, 1);
INSERT INTO `Lunch_item` (`Lunch_ID`, `Item_Number`, `LI_Quantity`) VALUES (4, 2, 2);
INSERT INTO `Lunch_item` (`Lunch_ID`, `Item_Number`, `LI_Quantity`) VALUES (4, 3, 1);
INSERT INTO `Lunch_item` (`Lunch_ID`, `Item_Number`, `LI_Quantity`) VALUES (4, 7, 2);
INSERT INTO `Lunch_item` (`Lunch_ID`, `Item_Number`, `LI_Quantity`) VALUES (4, 10, 2);
INSERT INTO `Lunch_item` (`Lunch_ID`, `Item_Number`, `LI_Quantity`) VALUES (5, 1, 1);
INSERT INTO `Lunch_item` (`Lunch_ID`, `Item_Number`, `LI_Quantity`) VALUES (5, 2, 1);
INSERT INTO `Lunch_item` (`Lunch_ID`, `Item_Number`, `LI_Quantity`) VALUES (5, 3, 2);
INSERT INTO `Lunch_item` (`Lunch_ID`, `Item_Number`, `LI_Quantity`) VALUES (5, 5, 1);
INSERT INTO `Lunch_item` (`Lunch_ID`, `Item_Number`, `LI_Quantity`) VALUES (5, 8, 1);
INSERT INTO `Lunch_item` (`Lunch_ID`, `Item_Number`, `LI_Quantity`) VALUES (6, 3, 1);
INSERT INTO `Lunch_item` (`Lunch_ID`, `Item_Number`, `LI_Quantity`) VALUES (6, 4, 1);
INSERT INTO `Lunch_item` (`Lunch_ID`, `Item_Number`, `LI_Quantity`) VALUES (6, 5, 1);
INSERT INTO `Lunch_item` (`Lunch_ID`, `Item_Number`, `LI_Quantity`) VALUES (6, 6, 1);
INSERT INTO `Lunch_item` (`Lunch_ID`, `Item_Number`, `LI_Quantity`) VALUES (6, 7, 1);
INSERT INTO `Lunch_item` (`Lunch_ID`, `Item_Number`, `LI_Quantity`) VALUES (7, 1, 1);
INSERT INTO `Lunch_item` (`Lunch_ID`, `Item_Number`, `LI_Quantity`) VALUES (7, 2, 1);
INSERT INTO `Lunch_item` (`Lunch_ID`, `Item_Number`, `LI_Quantity`) VALUES (7, 8, 1);
INSERT INTO `Lunch_item` (`Lunch_ID`, `Item_Number`, `LI_Quantity`) VALUES (8, 1, 2);
INSERT INTO `Lunch_item` (`Lunch_ID`, `Item_Number`, `LI_Quantity`) VALUES (8, 4, 1);
INSERT INTO `Lunch_item` (`Lunch_ID`, `Item_Number`, `LI_Quantity`) VALUES (8, 5, 1);
INSERT INTO `Lunch_item` (`Lunch_ID`, `Item_Number`, `LI_Quantity`) VALUES (8, 6, 1);
INSERT INTO `Lunch_item` (`Lunch_ID`, `Item_Number`, `LI_Quantity`) VALUES (8, 8, 1);
INSERT INTO `Lunch_item` (`Lunch_ID`, `Item_Number`, `LI_Quantity`) VALUES (8, 9, 1);
INSERT INTO `Lunch_item` (`Lunch_ID`, `Item_Number`, `LI_Quantity`) VALUES (9, 1, 1);
INSERT INTO `Lunch_item` (`Lunch_ID`, `Item_Number`, `LI_Quantity`) VALUES (9, 2, 1);
INSERT INTO `Lunch_item` (`Lunch_ID`, `Item_Number`, `LI_Quantity`) VALUES (9, 3, 2);
INSERT INTO `Lunch_item` (`Lunch_ID`, `Item_Number`, `LI_Quantity`) VALUES (9, 4, 1);
INSERT INTO `Lunch_item` (`Lunch_ID`, `Item_Number`, `LI_Quantity`) VALUES (10, 2, 2);
INSERT INTO `Lunch_item` (`Lunch_ID`, `Item_Number`, `LI_Quantity`) VALUES (10, 4, 1);
INSERT INTO `Lunch_item` (`Lunch_ID`, `Item_Number`, `LI_Quantity`) VALUES (10, 7, 1);
INSERT INTO `Lunch_item` (`Lunch_ID`, `Item_Number`, `LI_Quantity`) VALUES (10, 8, 2);
INSERT INTO `Lunch_item` (`Lunch_ID`, `Item_Number`, `LI_Quantity`) VALUES (11, 1, 1);
INSERT INTO `Lunch_item` (`Lunch_ID`, `Item_Number`, `LI_Quantity`) VALUES (11, 2, 1);
INSERT INTO `Lunch_item` (`Lunch_ID`, `Item_Number`, `LI_Quantity`) VALUES (11, 3, 1);
INSERT INTO `Lunch_item` (`Lunch_ID`, `Item_Number`, `LI_Quantity`) VALUES (11, 4, 2);
INSERT INTO `Lunch_item` (`Lunch_ID`, `Item_Number`, `LI_Quantity`) VALUES (12, 1, 1);
INSERT INTO `Lunch_item` (`Lunch_ID`, `Item_Number`, `LI_Quantity`) VALUES (12, 3, 1);
INSERT INTO `Lunch_item` (`Lunch_ID`, `Item_Number`, `LI_Quantity`) VALUES (12, 5, 1);
INSERT INTO `Lunch_item` (`Lunch_ID`, `Item_Number`, `LI_Quantity`) VALUES (12, 8, 1);
INSERT INTO `Lunch_item` (`Lunch_ID`, `Item_Number`, `LI_Quantity`) VALUES (12, 9, 1);
INSERT INTO `Lunch_item` (`Lunch_ID`, `Item_Number`, `LI_Quantity`) VALUES (13, 1, 1);
INSERT INTO `Lunch_item` (`Lunch_ID`, `Item_Number`, `LI_Quantity`) VALUES (13, 2, 1);
INSERT INTO `Lunch_item` (`Lunch_ID`, `Item_Number`, `LI_Quantity`) VALUES (13, 3, 1);
INSERT INTO `Lunch_item` (`Lunch_ID`, `Item_Number`, `LI_Quantity`) VALUES (13, 4, 2);
INSERT INTO `Lunch_item` (`Lunch_ID`, `Item_Number`, `LI_Quantity`) VALUES (13, 5, 1);
INSERT INTO `Lunch_item` (`Lunch_ID`, `Item_Number`, `LI_Quantity`) VALUES (14, 1, 1);
INSERT INTO `Lunch_item` (`Lunch_ID`, `Item_Number`, `LI_Quantity`) VALUES (14, 2, 1);
INSERT INTO `Lunch_item` (`Lunch_ID`, `Item_Number`, `LI_Quantity`) VALUES (14, 3, 1);
INSERT INTO `Lunch_item` (`Lunch_ID`, `Item_Number`, `LI_Quantity`) VALUES (14, 4, 1);
INSERT INTO `Lunch_item` (`Lunch_ID`, `Item_Number`, `LI_Quantity`) VALUES (14, 5, 1);
INSERT INTO `Lunch_item` (`Lunch_ID`, `Item_Number`, `LI_Quantity`) VALUES (15, 1, 1);
INSERT INTO `Lunch_item` (`Lunch_ID`, `Item_Number`, `LI_Quantity`) VALUES (15, 2, 1);
INSERT INTO `Lunch_item` (`Lunch_ID`, `Item_Number`, `LI_Quantity`) VALUES (15, 3, 2);
INSERT INTO `Lunch_item` (`Lunch_ID`, `Item_Number`, `LI_Quantity`) VALUES (15, 4, 1);
INSERT INTO `Lunch_item` (`Lunch_ID`, `Item_Number`, `LI_Quantity`) VALUES (16, 1, 1);
INSERT INTO `Lunch_item` (`Lunch_ID`, `Item_Number`, `LI_Quantity`) VALUES (16, 2, 1);
INSERT INTO `Lunch_item` (`Lunch_ID`, `Item_Number`, `LI_Quantity`) VALUES (16, 3, 1);
INSERT INTO `Lunch_item` (`Lunch_ID`, `Item_Number`, `LI_Quantity`) VALUES (16, 4, 1);
INSERT INTO `Lunch_item` (`Lunch_ID`, `Item_Number`, `LI_Quantity`) VALUES (16, 5, 1);
# 71 records
#
# Table structure for table 'Supplier'
#
DROP TABLE IF EXISTS `Supplier`;
CREATE TABLE `Supplier` (
`Supplier_ID` VARCHAR(3) NOT NULL,
`Supplier_Name` VARCHAR(30),
PRIMARY KEY (`Supplier_ID`),
INDEX (`Supplier_ID`)
) ENGINE=myisam DEFAULT CHARSET=utf8;
SET autocommit=1;
#
# Dumping data for table 'Supplier'
#
INSERT INTO `Supplier` (`Supplier_ID`, `Supplier_Name`) VALUES ('Arr', 'Alice & Ray\'s Restaurant');
INSERT INTO `Supplier` (`Supplier_ID`, `Supplier_Name`) VALUES ('Asp', 'A Soup Place');
INSERT INTO `Supplier` (`Supplier_ID`, `Supplier_Name`) VALUES ('Cbc', 'Certified Beef Company');
INSERT INTO `Supplier` (`Supplier_ID`, `Supplier_Name`) VALUES ('Frv', 'Frank Reed\'s Vegetables');
INSERT INTO `Supplier` (`Supplier_ID`, `Supplier_Name`) VALUES ('Fsn', 'Frank & Sons');
INSERT INTO `Supplier` (`Supplier_ID`, `Supplier_Name`) VALUES ('Jbr', 'Just Beverages');
INSERT INTO `Supplier` (`Supplier_ID`, `Supplier_Name`) VALUES ('Jps', 'Jim Parker\'s Shop');
INSERT INTO `Supplier` (`Supplier_ID`, `Supplier_Name`) VALUES ('Vsb', 'Virginia Street Bakery');
# 8 records
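Once class.sql has been loaded, a quick join can confirm that the tables relate as expected. This check is illustrative and not part of the original dump:

SELECT e.Emp_FirstName, e.Emp_LastName, d.Dept_Name
FROM Employee AS e
JOIN Department AS d ON e.Dept_Code = d.Dept_Code
ORDER BY d.Dept_Name;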
{
“cells”: [
{
“cell_type”: “markdown”,
“metadata”: {},
“source”: [
“# Python Basics (Instructor: Dr. Milad Baghersad)\n”,
“## Module 3: Web Scraping with Python Part 1\n”,
“\n”
]
},
{
“cell_type”: “markdown”,
“metadata”: {},
“source”: [
“___\n”,
“### Importing a plain text file (locally)”
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: [
“filename = \”Tesla.txt\”\n”
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: [
“file = open(filename, mode= \”r\”) #read mode”
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: [
“text = file.read()”
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: [
“print(text)”
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: [
“file.close()”
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: [
“with open(filename, mode= \”r\”) as file:\n”,
” text = file.read()\n”,
” print(text)”
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: [
“with open(filename, mode= \”w\”) as file: #we can write\n”,
” file.write(text)\n”,
” #file.wrte(text)\n”,
” ”
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: [
“with open(filename, mode= \”a\”) as file: #we can append\n”,
” file.write(\”Apple\”)\n”,
” ”
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: [
“filepath = \”C:\\\\Users\\\\Milad\\\\Desktop\\\\ISM 6405\\\\Module 3\\\\Tesla.txt\” #Windows\n”,
“#filepath = \”C:/Users/Milad/Desktop/ISM 6405/Module 3/Tesla.txt\” #mac”
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: [
“with open(filepath, mode= \”r\”) as file:\n”,
” text = file.read()\n”,
” print(text)”
]
},
{
“cell_type”: “markdown”,
“metadata”: {},
“source”: [
“—\n”,
“### Use pandas to import csv files (we learn more about pandas later):”
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: [
“import pandas as pd”
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: [
“dealership_data = pd.read_csv(\”dealership.csv\”, delimiter=\”,\”)”
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: [
“print(dealership_data)”
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: [
“dealership_data.at[0,\”Profit\”]”
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: []
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: []
},
{
“cell_type”: “markdown”,
“metadata”: {},
“source”: [
“___\n”,
“### Importing a plain text file from web:”
]
},
{
“cell_type”: “markdown”,
“metadata”: {},
“source”: [
“### Example:\n”,
“U.S. Securities and Exchange Commission (SEC):\n”,
“\n”,
“https://www.sec.gov/Archives/edgar/full-index/\n”
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: [
“#read the master file, open reports mentioned in the master file and check if a word exists in the report\n”,
“filepath = \”C:\\\\Users\\\\Milad\\\\Desktop\\\\ISM 6405\\\\Module 3\\\\master-2018Q1 – short version.idx\”\n”,
“\n”,
“with open(filepath,'r') as mastertext:\n”,
” content=mastertext.readlines()\n”,
” \n”,
” for row in content:\n”,
” print (row)\n”,
” \n”,
” if len(row)!= 0:\n”,
” row = row.strip('\\n')\n”,
” if str(row).endswith(\”.txt\”):\n”,
” columns = row.split(\”|\”)\n”,
” #print(columns)\n”,
” cik = columns[0]\n”,
” companyname = columns[1]\n”,
” formtype = columns[2]\n”,
” datefield = columns[3]\n”,
” filenames = columns[4]\n”,
” print(filenames)\n”,
” \n”,
” archivedUrl = \”https://www.sec.gov/Archives/\” + filenames\n”,
” print(archivedUrl)\n”,
” ”
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: [
“#read the master file, open reports mentioned in the master file and check if a word exists in the report\n”,
“filepath = \”C:\\\\Users\\\\Milad\\\\Desktop\\\\ISM 6405\\\\Module 3\\\\master-2018Q1 – short version.idx\”\n”,
“\n”,
“with open(filepath,'r') as mastertext:\n”,
” content=mastertext.readlines()\n”,
” \n”,
” for row in content:\n”,
” print (row)\n”,
” \n”,
” if len(row)!= 0:\n”,
” row = row.strip('\\n')\n”,
” if str(row).endswith(\”.txt\”):\n”,
” columns = row.split(\”|\”)\n”,
” #print(columns)\n”,
” cik = columns[0]\n”,
” companyname = columns[1]\n”,
” formtype = columns[2]\n”,
” datefield = columns[3]\n”,
” filenames = columns[4]\n”,
” \n”,
” archivedUrl = \”https://www.sec.gov/Archives/\” + filenames\n”,
” print(archivedUrl)\n”,
” \n”,
” import requests\n”,
” response = requests.get(archivedUrl)\n”,
” print(response.content[:400])\n”,
” report_text = response.text\n”,
” if \”merger\” in report_text.lower():\n”,
” print(\”YES!\”)\n”,
” else:\n”,
” print(\”NO\”)\n”,
” ”
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: []
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: [
“#save 10-Q reports\n”,
“filepath = \”C:\\\\Users\\\\Milad\\\\Desktop\\\\ISM 6405\\\\Module 3\\\\master-2018Q1 – short version.idx\”\n”,
“\n”,
“with open(filepath,'r') as mastertext:\n”,
” content=mastertext.readlines()\n”,
” \n”,
” for row in content:\n”,
” print (row)\n”,
” \n”,
” if len(row)!= 0:\n”,
” row = row.strip('\\n')\n”,
” if str(row).endswith(\”.txt\”):\n”,
” columns = row.split(\”|\”)\n”,
” #print(columns)\n”,
” cik = columns[0]\n”,
” companyname = columns[1]\n”,
” formtype = columns[2]\n”,
” datefield = columns[3]\n”,
” filenames = columns[4]\n”,
” \n”,
” archivedUrl = \”https://www.sec.gov/Archives/\” + filenames\n”,
” print(archivedUrl)\n”,
” \n”,
” response = requests.get(archivedUrl)\n”,
” response.encoding = 'utf-8'\n”,
” print(response.content[:400])\n”,
” report_text = response.text\n”,
” \n”,
” filename = filenames.split(\”/\”)[-1]\n”,
” if formtype == \”10-Q\”:\n”,
” with open(filename, 'w') as f:\n”,
” f.write(report_text)”
]
},
{
“cell_type”: “markdown”,
“metadata”: {},
“source”: [
“### save the code as a function:”
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: [
“def downloadreports(filepath):\n”,
” with open(filepath,'r') as mastertext:\n”,
” content=mastertext.readlines()\n”,
” \n”,
” for row in content:\n”,
” print (row)\n”,
” \n”,
” if len(row)!= 0:\n”,
” row = row.strip('\\n')\n”,
” if str(row).endswith(\”.txt\”):\n”,
” columns = row.split(\”|\”)\n”,
” #print(columns)\n”,
” cik = columns[0]\n”,
” companyname = columns[1]\n”,
” formtype = columns[2]\n”,
” datefield = columns[3]\n”,
” filenames = columns[4]\n”,
” \n”,
” archivedUrl = \”https://www.sec.gov/Archives/\” + filenames\n”,
” print(archivedUrl)\n”,
” \n”,
” r = requests.get(archivedUrl)\n”,
” r.encoding = 'utf-8'\n”,
” print(r.content[:400])\n”,
” report_text = r.text\n”,
” \n”,
” filename = filenames.split(\”/\”)[-1]\n”,
” if formtype == \”10-Q\”:\n”,
” with open(filename, 'w') as f:\n”,
” f.write(report_text) ”
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: []
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: [
“downloadreports(filepath = \”C:\\\\Users\\\\Milad\\\\Desktop\\\\ISM 6405\\\\Module 3\\\\master-2018Q1 – short version.idx\”)”
]
},
{
“cell_type”: “markdown”,
“metadata”: {},
“source”: [
“### Use urllib library to save a file from web:”
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: [
“from urllib.request import urlretrieve”
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: [
“archivedUrl = \”https://www.sec.gov/Archives/edgar/data/1000045/0001193125-18-037381.txt\””
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: [
“urlretrieve(archivedUrl, \”downloaded_urllib.txt\”)”
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: []
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: []
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: []
},
{
“cell_type”: “markdown”,
“metadata”: {},
“source”: [
“___\n”,
“______\n”,
“___\n”,
“___\n”,
“___\n”,
“___\n”,
“___\n”,
“\n”,
“### Extract information from HTML:”
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: [
“#Example: save data about a product recall in https://www.cpsc.gov\n”,
“url = \”https://www.cpsc.gov/Recalls?combine=sofa&field_rc_date%5Bdate%5D=&field_rc_date_1%5Bdate%5D=\””
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: [
“import requests\n”,
“response = requests.get(url)\n”,
“response.status_code”
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: [
“if response.status_code == 200:\n”,
” print(\”Success\”)\n”,
“else:\n”,
” print(\”Failure\”)”
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {
“scrolled”: false
},
“outputs”: [],
“source”: [
“sofa_recall_text = response.text\n”,
“print(sofa_recall_text)”
]
},
{
“cell_type”: “markdown”,
“metadata”: {},
“source”: [
“### use BeautifulSoup package to add readability plus having useful functions\n”,
“https://www.crummy.com/software/BeautifulSoup/”
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: [
“from bs4 import BeautifulSoup”
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: [
“soup = BeautifulSoup(sofa_recall_text)”
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: [
“print(soup.prettify())”
]
},
{
“cell_type”: “markdown”,
“metadata”: {},
“source”: [
“### Functions in BeautifulSoup”
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: [
“print(soup.title)”
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: [
“print(soup.title.get_text()) #get_text gets texts inside a tag”
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: [
“# find_all finds all instances of a tag\n”,
“links = soup.find_all(\”a\”) #hyperlinks are defined tag in HTML\n”,
“print(links)”
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: [
“for link in links: \n”,
” print(link.get(\”href\”))”
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: [
“# find() finds the first instance of a tag\n”,
“first_link = soup.find('a')”
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: [
“print(first_link)”
]
},
{
“cell_type”: “markdown”,
“metadata”: {},
“source”: [
“### use functions to collect data from CPSC”
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: [
“recalled_sofas = soup.find_all(\”div\”, class_=\”views-field views-field-php\”)\n”,
“print(recalled_sofas)”
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: [
“soup.find(\”div\”, class_=\”views-field views-field-php\”)”
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: [
“recall = soup.find(\”div\”, class_=\”views-field views-field-php\”)\n”,
“recall_link = recall.find(\”a\”)\n”,
“link_url = recall_link.get('href')\n”,
“print(\”link url:\”,link_url)”
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: [
“recall_title = recall.find(\”div\”, class_=\”title\”)\n”,
“print(recall_title)”
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: [
“recall.find(\”div\”, class_=\”title\”).get_text()”
]
},
{
“cell_type”: “markdown”,
“metadata”: {},
“source”: [
“—\n”,
“—\n”,
“—\n”,
“—\n”,
“\n”,
“### Write a function that returns recalls title, date, introduction, remedy, units, etc. when you enter a keyword”
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: [
“def get_recalls_v1(keywords):\n”,
” \n”,
” import requests\n”,
” from bs4 import BeautifulSoup\n”,
” \n”,
” url = \”https://www.cpsc.gov/Recalls?combine=\” + keywords\n”,
” response = requests.get(url)\n”,
” if not response.status_code == 200:\n”,
” print(\”the link is broken\”)\n”,
” return None\n”,
” \n”,
” recall_list = list()\n”,
” \n”,
” \n”,
” try:\n”,
” webpage_text = response.text\n”,
” soup = BeautifulSoup(webpage_text)\n”,
” recalls = soup.find_all(\”div\”, class_=\”views-field views-field-php\”)\n”,
” for recall in recalls:\n”,
” recall_title = recall.find(\”div\”, class_=\”title\”).get_text()\n”,
” \n”,
” recall_list.append((recall_title))\n”,
” return recall_list\n”,
” \n”,
” except:\n”,
” print(\”Error!!!\”)\n”,
” return None\n”
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: [
“get_recalls_v1(\”sofa\”)”
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: [
“def get_recalls_v2(keywords):\n”,
” \n”,
” import requests\n”,
” from bs4 import BeautifulSoup\n”,
” \n”,
” url = \”https://www.cpsc.gov/Recalls?combine=\” + keywords\n”,
” response = requests.get(url)\n”,
” if not response.status_code == 200:\n”,
” print(\”the link is broken\”)\n”,
” return None\n”,
” \n”,
” recall_list = list()\n”,
” \n”,
” \n”,
” try:\n”,
” webpage_text = response.text\n”,
” soup = BeautifulSoup(webpage_text)\n”,
” recalls = soup.find_all(\”div\”, class_=\”views-field views-field-php\”)\n”,
” for recall in recalls:\n”,
” recall_title = recall.find(\”div\”, class_=\”title\”).get_text()\n”,
” recall_date = recall.find(\”div\”, class_=\”date\”).get_text()\n”,
” recall_introduction = recall.find(\”div\”, class_=\”introduction\”).get_text()\n”,
” recall_remedy = recall.find(\”div\”, class_=\”remedy\”).get_text()\n”,
” recall_units = recall.find(\”div\”, class_=\”units\”).get_text()\n”,
” recall_link = recall.find(\”a\”).get(\”href\”)\n”,
” \n”,
” recall_list.append((recall_title, recall_date, recall_introduction, recall_remedy, recall_units, recall_link))\n”,
” return recall_list\n”,
” \n”,
” except:\n”,
” print(\”Error!!!\”)\n”,
” return None”
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: [
“get_recalls_v2(\”sofa\”) ### will give error”
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: [
“def get_recalls_v3(keywords):\n”,
” \n”,
” import requests\n”,
” from bs4 import BeautifulSoup\n”,
” \n”,
” url = \”https://www.cpsc.gov/Recalls?combine=\” + keywords\n”,
” response = requests.get(url)\n”,
” if not response.status_code == 200:\n”,
” print(\”the link is broken\”)\n”,
” return None\n”,
” \n”,
” recall_list = list()\n”,
” \n”,
” \n”,
” try:\n”,
” webpage_text = response.text\n”,
” soup = BeautifulSoup(webpage_text)\n”,
” recalls = soup.find_all(\”div\”, class_=\”views-field views-field-php\”)\n”,
” for recall in recalls:\n”,
” recall_title = recall.find(\”div\”, class_=\”title\”).get_text()\n”,
” recall_date = recall.find(\”div\”, class_=\”date\”).get_text()\n”,
” recall_introduction = recall.find(\”div\”, class_=\”introduction\”).get_text()\n”,
” \n”,
” try:\n”,
” recall_remedy = recall.find(\”div\”, class_=\”remedy\”).get_text()\n”,
” except:\n”,
” recall_remedy = None\n”,
” try:\n”,
” recall_units = recall.find(\”div\”, class_=\”units\”).get_text()\n”,
” except:\n”,
” recall_units = None\n”,
” try:\n”,
” recall_link = recall.find(\”a\”).get(\”href\”)\n”,
” recall_link = \”https://www.cpsc.gov\” + recall_link\n”,
” except: \n”,
” return None\n”,
“\n”,
” recall_list.append((recall_title, recall_date, recall_introduction, recall_remedy, recall_units, recall_link))\n”,
” return recall_list\n”,
” \n”,
” except:\n”,
” print(\”Error!!!\”)\n”,
” return None”
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: [
“get_recalls_v3(\”sofa\”)”
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: [
“get_recalls_v3(\”chair\”)”
]
},
{
“cell_type”: “markdown”,
“metadata”: {},
“source”: [
“—\n”,
“—\n”,
“—\n”,
“—\n”,
“\n”,
“### Write a function that returns gets recall_link and returns (if possible) \”Manufactured In\”:”
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: [
“recall_link = \”https://www.cpsc.gov/Recalls/2019/CVB-Recalls-LUCID-Folding-Mattress-Sofas-Due-to-Violation-of-Federal-Mattress-Flammability-Standard\””
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: [
“response = requests.get(recall_link)\n”,
“response.status_code”
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: [
“webpage_text = response.text\n”,
“soup = BeautifulSoup(webpage_text)”
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: [
“soup”
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: [
“recall_product = soup.find_all(\”div\”,class_=\”field\”)\n”,
“print(recall_product)”
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: [
“for item in recall_product:\n”,
” if item.find(\”div\”,class_=\”field-label\”).get_text() == \”Manufactured In: \”:\n”,
” manufactured_country = item.find(\”div\”,class_=\”field-item\”).get_text()”
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: [
“manufactured_country”
]
},
{
“cell_type”: “markdown”,
“metadata”: {},
“source”: [
“#### function:”
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: [
“def get_recall_manufactured_country(recall_link):\n”,
” \n”,
” import requests\n”,
” from bs4 import BeautifulSoup\n”,
” \n”,
” response = requests.get(recall_link)\n”,
” if not response.status_code == 200:\n”,
” print(\”the link is broken\”)\n”,
” return None\n”,
” \n”,
” recall_info = list()\n”,
” \n”,
” \n”,
” try:\n”,
” webpage_text = response.text\n”,
” soup = BeautifulSoup(webpage_text)\n”,
” recall_product = soup.find_all(\”div\”,class_=\”field\”)\n”,
” try:\n”,
” for item in recall_product:\n”,
” if item.find(\”div\”,class_=\”field-label\”).get_text() == \”Manufactured In: \”:\n”,
” manufactured_country = item.find(\”div\”,class_=\”field-item\”).get_text()\n”,
” return manufactured_country \n”,
” \n”,
” except:\n”,
” return None\n”,
” \n”,
” except:\n”,
” return None”
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: [
“get_recall_manufactured_country(recall_link)”
]
},
{
“cell_type”: “markdown”,
“metadata”: {},
“source”: [
“—\n”,
“—\n”,
“—\n”,
“—\n”,
“\n”,
“### add manufactured country information \”get_recalls_v3(keywords)\” function”
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: [
“def get_recalls_v4(keywords):\n”,
“    \n”,
“    import requests\n”,
“    from bs4 import BeautifulSoup\n”,
“    \n”,
“    url = \”https://www.cpsc.gov/Recalls?combine=\” + keywords\n”,
“    response = requests.get(url)\n”,
“    if not response.status_code == 200:\n”,
“        print(\”the link is broken\”)\n”,
“        return None\n”,
“    \n”,
“    recall_list = list()\n”,
“    \n”,
“    try:\n”,
“        webpage_text = response.text\n”,
“        soup = BeautifulSoup(webpage_text, \”html.parser\”)\n”,
“        recalls = soup.find_all(\”div\”, class_=\”views-field views-field-php\”)\n”,
“        for recall in recalls:\n”,
“            recall_title = recall.find(\”div\”, class_=\”title\”).get_text()\n”,
“            recall_date = recall.find(\”div\”, class_=\”date\”).get_text()\n”,
“            recall_introduction = recall.find(\”div\”, class_=\”introduction\”).get_text()\n”,
“            \n”,
“            try:\n”,
“                recall_remedy = recall.find(\”div\”, class_=\”remedy\”).get_text()\n”,
“            except AttributeError:\n”,
“                recall_remedy = None\n”,
“            try:\n”,
“                recall_units = recall.find(\”div\”, class_=\”units\”).get_text()\n”,
“            except AttributeError:\n”,
“                recall_units = None\n”,
“            # a missing link should skip this recall’s extras, not abort the whole result set\n”,
“            try:\n”,
“                recall_link = \”https://www.cpsc.gov\” + recall.find(\”a\”).get(\”href\”)\n”,
“                manufactured_country = get_recall_manufactured_country(recall_link)\n”,
“            except AttributeError:\n”,
“                recall_link = None\n”,
“                manufactured_country = None\n”,
“            \n”,
“            recall_list.append((recall_title, recall_date, recall_introduction, recall_remedy, recall_units,\n”,
“                                recall_link, manufactured_country))\n”,
“        return recall_list\n”,
“    \n”,
“    except Exception:\n”,
“        print(\”Error!!!\”)\n”,
“        return None”
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: [
“get_recalls_v4(\”tv\”)”
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: []
}
],
“metadata”: {
“kernelspec”: {
“display_name”: “Python 3”,
“language”: “python”,
“name”: “python3”
},
“language_info”: {
“codemirror_mode”: {
“name”: “ipython”,
“version”: 3
},
“file_extension”: “.py”,
“mimetype”: “text/x-python”,
“name”: “python”,
“nbconvert_exporter”: “python”,
“pygments_lexer”: “ipython3”,
“version”: “3.7.3”
}
},
“nbformat”: 4,
“nbformat_minor”: 2
}
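The scraper above returns a list of seven-element tuples (title, date, introduction, remedy, units, link, manufactured country). To connect it to the MySQL half of this module, here is a minimal sketch of loading those tuples into a local table. The host, user, password, the class database, and the recalls table with its column names are placeholder assumptions, not anything defined in the notebook above.

import mysql.connector

# reuse the scraper defined above; fall back to an empty list on failure
rows = get_recalls_v4("tv") or []

# placeholder connection details -- substitute your own server and credentials
db = mysql.connector.connect(host="localhost", user="root",
                             password="yourpassword", database="class")
cursor = db.cursor()

# hypothetical table: one column per element of the scraped tuples
cursor.execute(
    "CREATE TABLE IF NOT EXISTS class.recalls ("
    "id INT NOT NULL AUTO_INCREMENT, title VARCHAR(500), recalldate VARCHAR(100), "
    "introduction TEXT, remedy TEXT, units VARCHAR(200), "
    "link VARCHAR(500), country VARCHAR(100), PRIMARY KEY (id));")

# %s placeholders let the connector handle quoting and None -> NULL
cursor.executemany(
    "INSERT INTO class.recalls (title, recalldate, introduction, remedy, units, link, country) "
    "VALUES (%s, %s, %s, %s, %s, %s, %s)", rows)
db.commit()

With executemany the connector escapes each value itself, so a recall title containing a quote cannot break the statement the way %-string interpolation would.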
Description: Master Index of EDGAR Dissemination Feed
Last Data Received: March 31, 2018
Comments: webmaster@sec.gov
Anonymous FTP: ftp://ftp.sec.gov/edgar/
Cloud HTTP: https://www.sec.gov/Archives/
CIK|Company Name|Form Type|Date Filed|Filename
--------------------------------------------------------------------------------
1000032|BINCH JAMES G|4|2018-02-16|edgar/data/1000032/0000913165-18-000034.txt
1000045|NICHOLAS FINANCIAL INC|10-Q|2018-02-09|edgar/data/1000045/0001193125-18-037381.txt
1000045|NICHOLAS FINANCIAL INC|4|2018-02-15|edgar/data/1000045/0001000045-18-000004.txt
1000045|NICHOLAS FINANCIAL INC|4|2018-03-08|edgar/data/1000045/0001000045-18-000005.txt
1000045|NICHOLAS FINANCIAL INC|4|2018-03-20|edgar/data/1000045/0001609591-18-000001.txt
1000045|NICHOLAS FINANCIAL INC|8-K|2018-01-09|edgar/data/1000045/0001193125-18-007253.txt
1000045|NICHOLAS FINANCIAL INC|8-K|2018-02-05|edgar/data/1000045/0001193125-18-032199.txt
1000045|NICHOLAS FINANCIAL INC|8-K|2018-02-07|edgar/data/1000045/0001193125-18-034693.txt
1000045|NICHOLAS FINANCIAL INC|8-K|2018-02-20|edgar/data/1000045/0001193125-18-049706.txt
1000045|NICHOLAS FINANCIAL INC|SC 13G/A|2018-02-12|edgar/data/1000045/0001104659-18-008485.txt
1000045|NICHOLAS FINANCIAL INC|SC 13G/A|2018-02-14|edgar/data/1000045/0001037389-18-000160.txt
1000045|NICHOLAS FINANCIAL INC|SC 13G|2018-02-09|edgar/data/1000045/0001258897-18-001316.txt
1000045|NICHOLAS FINANCIAL INC|SC 13G|2018-02-13|edgar/data/1000045/0000315066-18-001444.txt
1000097|KINGDON CAPITAL MANAGEMENT, L.L.C.|13F-HR|2018-02-14|edgar/data/1000097/0000919574-18-001804.txt
1000097|KINGDON CAPITAL MANAGEMENT, L.L.C.|SC 13G/A|2018-01-02|edgar/data/1000097/0000919574-18-000008.txt
1000097|KINGDON CAPITAL MANAGEMENT, L.L.C.|SC 13G/A|2018-02-14|edgar/data/1000097/0000919574-18-001760.txt
1000097|KINGDON CAPITAL MANAGEMENT, L.L.C.|SC 13G/A|2018-02-14|edgar/data/1000097/0000919574-18-001765.txt
1000097|KINGDON CAPITAL MANAGEMENT, L.L.C.|SC 13G/A|2018-02-14|edgar/data/1000097/0000919574-18-001773.txt
1000097|KINGDON CAPITAL MANAGEMENT, L.L.C.|SC 13G/A|2018-02-14|edgar/data/1000097/0000919574-18-001777.txt
1000097|KINGDON CAPITAL MANAGEMENT, L.L.C.|SC 13G/A|2018-02-14|edgar/data/1000097/0000919574-18-001785.txt
1000097|KINGDON CAPITAL MANAGEMENT, L.L.C.|SC 13G|2018-02-14|edgar/data/1000097/0000919574-18-001790.txt
1000177|NORDIC AMERICAN TANKERS Ltd|6-K|2018-02-28|edgar/data/1000177/0000919574-18-002148.txt
1000177|NORDIC AMERICAN TANKERS Ltd|CORRESP|2018-01-19|edgar/data/1000177/0000919574-18-000510.txt
1000177|NORDIC AMERICAN TANKERS Ltd|CORRESP|2018-02-08|edgar/data/1000177/0000919574-18-000929.txt
1000177|NORDIC AMERICAN TANKERS Ltd|CORRESP|2018-02-22|edgar/data/1000177/0000919574-18-001989.txt
1000177|NORDIC AMERICAN TANKERS Ltd|CORRESP|2018-03-02|edgar/data/1000177/0000919574-18-002224.txt
1000177|NORDIC AMERICAN TANKERS Ltd|SC 13G|2018-01-12|edgar/data/1000177/0000895421-18-000006.txt
1000177|NORDIC AMERICAN TANKERS Ltd|UPLOAD|2018-02-05|edgar/data/1000177/0000000000-18-004057.txt
1000177|NORDIC AMERICAN TANKERS Ltd|UPLOAD|2018-03-19|edgar/data/1000177/0000000000-18-008366.txt
1000184|SAP SE|20-F|2018-02-28|edgar/data/1000184/0001104659-18-013050.txt
1000184|SAP SE|6-K|2018-01-30|edgar/data/1000184/0001104659-18-005109.txt
1000184|SAP SE|6-K|2018-01-31|edgar/data/1000184/0001104659-18-005283.txt
1000184|SAP SE|6-K|2018-02-22|edgar/data/1000184/0001104659-18-011197.txt
1000184|SAP SE|6-K|2018-03-01|edgar/data/1000184/0001104659-18-013742.txt
1000184|SAP SE|6-K|2018-03-06|edgar/data/1000184/0001104659-18-015111.txt
1000184|SAP SE|DFAN14A|2018-01-30|edgar/data/1000184/0001104659-18-005123.txt
1000184|SAP SE|DFAN14A|2018-01-30|edgar/data/1000184/0001104659-18-005124.txt
1000184|SAP SE|DFAN14A|2018-01-30|edgar/data/1000184/0001104659-18-005125.txt
1000184|SAP SE|DFAN14A|2018-01-30|edgar/data/1000184/0001104659-18-005126.txt
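Each filing row in the master index sample above is pipe-delimited with five fields (CIK, Company Name, Form Type, Date Filed, Filename), preceded by banner, header, and separator lines. A standard-library sketch of parsing it, assuming the sample is saved as Mod05_master-2018Q1-v2.idx (the filename the next notebook uses):

records = []
with open("Mod05_master-2018Q1-v2.idx", "r") as f:
    for line in f:
        line = line.strip("\n")
        # only real filing rows end in ".txt"; banner, header, and separator lines do not
        if line.endswith(".txt"):
            cik, companyname, formtype, datefiled, filename = line.split("|")
            records.append((cik, companyname, formtype, datefiled, filename))

print(records[0])
# ('1000032', 'BINCH JAMES G', '4', '2018-02-16', 'edgar/data/1000032/0000913165-18-000034.txt')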
{
“cells”: [
{
“cell_type”: “code”,
“execution_count”: 2,
“metadata”: {},
“outputs”: [],
“source”: [
“import mysql.connector”
]
},
{
“cell_type”: “code”,
“execution_count”: 3,
“metadata”: {},
“outputs”: [],
“source”: [
“import csv”
]
},
{
“cell_type”: “code”,
“execution_count”: 4,
“metadata”: {},
“outputs”: [],
“source”: [
“database = mysql.connector.connect(host=\”localhost\”,user=\”root\”,password=\”priya143\”, database=\”class\”)\n”,
“cursor = database.cursor()”
]
},
{
“cell_type”: “code”,
“execution_count”: 5,
“metadata”: {},
“outputs”: [],
“source”: [
“# IF NOT EXISTS keeps this cell re-runnable; the original raised 1050 (42S01): Table ‘firmsreports’ already exists\n”,
“query = \”CREATE TABLE IF NOT EXISTS class.firmsreports (id INT NOT NULL AUTO_INCREMENT, cik INT NULL, companyname VARCHAR(500) NULL, formtype VARCHAR(45) NULL, datefiled DATE NULL, filename VARCHAR(500) NULL, PRIMARY KEY (id));\”\n”,
“cursor.execute(query)”
]
},
{
“cell_type”: “code”,
“execution_count”: 6,
“metadata”: {},
“outputs”: [],
“source”: [
“idxfile = \”Mod05_master-2018Q1-v2.idx\””
]
},
{
“cell_type”: “code”,
“execution_count”: 16,
“metadata”: {},
“outputs”: [
{
“name”: “stdout”,
“output_type”: “stream”,
“text”: [
“Description: Master Index of EDGAR Dissemination Feed\n”,
“\n”,
“Last Data Received: March 31, 2018\n”,
“\n”,
“Comments: webmaster@sec.gov\n”,
“\n”,
“Anonymous FTP: ftp://ftp.sec.gov/edgar/\n”,
“\n”,
“Cloud HTTP: https://www.sec.gov/Archives/\n”,
“\n”,
“\n”,
“\n”,
” \n”,
“\n”,
” \n”,
“\n”,
” \n”,
“\n”,
“CIK|Company Name|Form Type|Date Filed|Filename\n”,
“\n”,
“--------------------------------------------------------------------------------\n”,
“\n”,
“1000032|BINCH JAMES G|4|2018-02-16|edgar/data/1000032/0000913165-18-000034.txt\n”,
“\n”,
“1000045|NICHOLAS FINANCIAL INC|10-Q|2018-02-09|edgar/data/1000045/0001193125-18-037381.txt\n”,
“\n”,
“1000045|NICHOLAS FINANCIAL INC|4|2018-02-15|edgar/data/1000045/0001000045-18-000004.txt\n”,
“\n”,
“1000045|NICHOLAS FINANCIAL INC|4|2018-03-08|edgar/data/1000045/0001000045-18-000005.txt\n”,
“\n”,
“1000045|NICHOLAS FINANCIAL INC|4|2018-03-20|edgar/data/1000045/0001609591-18-000001.txt\n”,
“\n”,
“1000045|NICHOLAS FINANCIAL INC|8-K|2018-01-09|edgar/data/1000045/0001193125-18-007253.txt\n”,
“\n”,
“1000045|NICHOLAS FINANCIAL INC|8-K|2018-02-05|edgar/data/1000045/0001193125-18-032199.txt\n”,
“\n”,
“1000045|NICHOLAS FINANCIAL INC|8-K|2018-02-07|edgar/data/1000045/0001193125-18-034693.txt\n”,
“\n”,
“1000045|NICHOLAS FINANCIAL INC|8-K|2018-02-20|edgar/data/1000045/0001193125-18-049706.txt\n”,
“\n”,
“1000045|NICHOLAS FINANCIAL INC|SC 13G/A|2018-02-12|edgar/data/1000045/0001104659-18-008485.txt\n”,
“\n”,
“1000045|NICHOLAS FINANCIAL INC|SC 13G/A|2018-02-14|edgar/data/1000045/0001037389-18-000160.txt\n”,
“\n”,
“1000045|NICHOLAS FINANCIAL INC|SC 13G|2018-02-09|edgar/data/1000045/0001258897-18-001316.txt\n”,
“\n”,
“1000045|NICHOLAS FINANCIAL INC|SC 13G|2018-02-13|edgar/data/1000045/0000315066-18-001444.txt\n”,
“\n”,
“1000097|KINGDON CAPITAL MANAGEMENT, L.L.C.|13F-HR|2018-02-14|edgar/data/1000097/0000919574-18-001804.txt\n”,
“\n”,
“1000097|KINGDON CAPITAL MANAGEMENT, L.L.C.|SC 13G/A|2018-01-02|edgar/data/1000097/0000919574-18-000008.txt\n”,
“\n”,
“1000097|KINGDON CAPITAL MANAGEMENT, L.L.C.|SC 13G/A|2018-02-14|edgar/data/1000097/0000919574-18-001760.txt\n”,
“\n”,
“1000097|KINGDON CAPITAL MANAGEMENT, L.L.C.|SC 13G/A|2018-02-14|edgar/data/1000097/0000919574-18-001765.txt\n”,
“\n”,
“1000097|KINGDON CAPITAL MANAGEMENT, L.L.C.|SC 13G/A|2018-02-14|edgar/data/1000097/0000919574-18-001773.txt\n”,
“\n”,
“1000097|KINGDON CAPITAL MANAGEMENT, L.L.C.|SC 13G/A|2018-02-14|edgar/data/1000097/0000919574-18-001777.txt\n”,
“\n”,
“1000097|KINGDON CAPITAL MANAGEMENT, L.L.C.|SC 13G/A|2018-02-14|edgar/data/1000097/0000919574-18-001785.txt\n”,
“\n”,
“1000097|KINGDON CAPITAL MANAGEMENT, L.L.C.|SC 13G|2018-02-14|edgar/data/1000097/0000919574-18-001790.txt\n”,
“\n”,
“1000177|NORDIC AMERICAN TANKERS Ltd|6-K|2018-02-28|edgar/data/1000177/0000919574-18-002148.txt\n”,
“\n”,
“1000177|NORDIC AMERICAN TANKERS Ltd|CORRESP|2018-01-19|edgar/data/1000177/0000919574-18-000510.txt\n”,
“\n”,
“1000177|NORDIC AMERICAN TANKERS Ltd|CORRESP|2018-02-08|edgar/data/1000177/0000919574-18-000929.txt\n”,
“\n”,
“1000177|NORDIC AMERICAN TANKERS Ltd|CORRESP|2018-02-22|edgar/data/1000177/0000919574-18-001989.txt\n”,
“\n”,
“1000177|NORDIC AMERICAN TANKERS Ltd|CORRESP|2018-03-02|edgar/data/1000177/0000919574-18-002224.txt\n”,
“\n”,
“1000177|NORDIC AMERICAN TANKERS Ltd|SC 13G|2018-01-12|edgar/data/1000177/0000895421-18-000006.txt\n”,
“\n”,
“1000177|NORDIC AMERICAN TANKERS Ltd|UPLOAD|2018-02-05|edgar/data/1000177/0000000000-18-004057.txt\n”,
“\n”,
“1000177|NORDIC AMERICAN TANKERS Ltd|UPLOAD|2018-03-19|edgar/data/1000177/0000000000-18-008366.txt\n”,
“\n”,
“1000184|SAP SE|20-F|2018-02-28|edgar/data/1000184/0001104659-18-013050.txt\n”,
“\n”,
“1000184|SAP SE|6-K|2018-01-30|edgar/data/1000184/0001104659-18-005109.txt\n”,
“\n”,
“1000184|SAP SE|6-K|2018-01-31|edgar/data/1000184/0001104659-18-005283.txt\n”,
“\n”,
“1000184|SAP SE|6-K|2018-02-22|edgar/data/1000184/0001104659-18-011197.txt\n”,
“\n”,
“1000184|SAP SE|6-K|2018-03-01|edgar/data/1000184/0001104659-18-013742.txt\n”,
“\n”,
“1000184|SAP SE|6-K|2018-03-06|edgar/data/1000184/0001104659-18-015111.txt\n”,
“\n”,
“1000184|SAP SE|DFAN14A|2018-01-30|edgar/data/1000184/0001104659-18-005123.txt\n”,
“\n”,
“1000184|SAP SE|DFAN14A|2018-01-30|edgar/data/1000184/0001104659-18-005124.txt\n”,
“\n”,
“1000184|SAP SE|DFAN14A|2018-01-30|edgar/data/1000184/0001104659-18-005125.txt\n”,
“\n”,
“1000184|SAP SE|DFAN14A|2018-01-30|edgar/data/1000184/0001104659-18-005126.txt\n”,
“Mod05_master-2018Q1-v2.idx\n”
]
}
],
“source”: [
“with open(idxfile, ‘r’) as mastertext:\n”,
“    content = mastertext.readlines()\n”,
“\n”,
“records = list()\n”,
“for row in content:\n”,
“    print(row)\n”,
“    row = row.strip(‘\\n’)\n”,
“    # only the pipe-delimited filing rows end in \”.txt\”; header lines are skipped\n”,
“    if row.endswith(\”.txt\”):\n”,
“        columns = row.split(\”|\”)\n”,
“        cik = columns[0]\n”,
“        companyname = columns[1]\n”,
“        formtype = columns[2]\n”,
“        datefiled = columns[3]\n”,
“        filename = columns[4]\n”,
“        records.append((cik, companyname, formtype, datefiled, filename))\n”,
“print(idxfile)”
]
},
{
“cell_type”: “code”,
“execution_count”: null,
“metadata”: {},
“outputs”: [],
“source”: [
“# a parameterized executemany avoids the quoting bugs of %-string interpolation and SQL injection\n”,
“query = \”INSERT INTO class.firmsreports (cik, companyname, formtype, datefiled, filename) VALUES (%s, %s, %s, %s, %s)\”\n”,
“cursor.executemany(query, records)\n”,
“database.commit()”
]
}
],
“metadata”: {
“kernelspec”: {
“display_name”: “Python 3”,
“language”: “python”,
“name”: “python3”
},
“language_info”: {
“codemirror_mode”: {
“name”: “ipython”,
“version”: 3
},
“file_extension”: “.py”,
“mimetype”: “text/x-python”,
“name”: “python”,
“nbconvert_exporter”: “python”,
“pygments_lexer”: “ipython3”,
“version”: “3.7.4”
}
},
“nbformat”: 4,
“nbformat_minor”: 2
}
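The notebook loads the rows but never reads them back. Here is a short sketch of displaying the table contents from MySQL; the connection details are the same placeholders as above, and class.firmsreports matches the CREATE TABLE statement in the notebook.

import mysql.connector

# placeholder credentials -- match whatever your local MySQL server uses
db = mysql.connector.connect(host="localhost", user="root",
                             password="yourpassword", database="class")
cursor = db.cursor()

# display the first ten loaded filings
cursor.execute("SELECT cik, companyname, formtype, datefiled, filename "
               "FROM class.firmsreports LIMIT 10")
for row in cursor.fetchall():
    print(row)

cursor.close()
db.close()

The same rows can also be confirmed from the mysql command-line client with SELECT * FROM class.firmsreports LIMIT 10;.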