diff --git a/lab-p5/README.md b/lab-p5/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..e503061aa353ee167e9c219e64156b40d40a0c06
--- /dev/null
+++ b/lab-p5/README.md
@@ -0,0 +1,90 @@
+# Lab P5: Looping Patterns and Hurricane API
+
+Let us start Lab-P5! This lab introduces some fundamental looping patterns that will help you solve P5. It is
+designed to help you become comfortable using the functions in `project.py`. You will also
+learn the basic string-manipulation methods needed for P5.
+
+## Corrections and clarifications
+
+None yet.
+
+**Found any issues?** Report them to us:
+
+- Ashwin Maran <amaran@wisc.edu>
+
+----------------------------------
+## Learning Objectives:
+In this lab you will practice:
+- inspecting the `project.py` file,
+- iterating through data using a `for` loop,
+- writing algorithms to search, filter, count, and find the min/max (see the short sketch below),
+- writing algorithms that store and use indices,
+- writing helper functions,
+- writing algorithms that manipulate strings.
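+
+For example, here is a minimal sketch of two of these looping patterns (count and max), using a hypothetical
+list of wind speeds rather than the real dataset you will work with in the notebook:
+
+```python
+# hypothetical wind speeds in mph (not taken from hurricanes.csv)
+speeds = [110, 70, 145, 130]
+
+# count: how many speeds are at least 100 mph?
+count = 0
+for speed in speeds:
+    if speed >= 100:
+        count += 1
+
+# max: find the index of the largest speed
+fastest_idx = 0
+for idx in range(len(speeds)):
+    if speeds[idx] > speeds[fastest_idx]:
+        fastest_idx = idx
+
+print(count)        # 3
+print(fastest_idx)  # 2
+```
+
+In the notebook, you will implement patterns like these using the functions in `project.py` instead of a hard-coded list.
+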
+----------------------------------
+
+## Introduction:
+In this lab, you will look at hurricane data and learn techniques to extract specific information from it. Data
+scientists use such data when studying the effects of climate change.
+
+According to the [Center for Climate and Energy Solutions](https://www.c2es.org/content/hurricanes-and-climate-change/),
+"Climate change is worsening
+hurricane impacts in the United States by increasing the intensity and decreasing the speed at
+which they travel. Scientists are currently uncertain whether there will be a change in the number
+of hurricanes, but they are certain that the intensity and severity of hurricanes will continue to
+increase. These trends make hurricanes far more costly in terms of physical damage and deaths."
+By tracking past hurricanes' speed, number of fatalities, and property damage, scientists can prepare for
+future ones.
+
+------------------------------
+
+## Note on Academic Misconduct
+
+You may do these lab exercises only with your project partner; you are not allowed to start working on Lab-P5 with one person, then do the project with a different partner. Now may be a good time to review [our course policies](https://cs220.cs.wisc.edu/f23/syllabus.html).
+
+------------------------------
+
+## Project partner
+
+We strongly recommend students find a project partner. Pair programming is a great way to learn from a fellow student. The projects become significantly more difficult as the semester progresses, so finding a project partner early in the semester is a good idea.
+
+If you are still looking for a project partner, take a moment now to ask around the room if anyone would like to partner with you on this project. Then you can work with them on this lab as well as the project.
+
+----------------------------------
+
+## Segment 1: Setup
+
+Create a `lab-p5` directory and download the following files into the `lab-p5` directory:
+
+* `hurricanes.csv`
+* `project.py`
+* `lab-p5.ipynb`
+* `public_tests.py`
+
+Once you have downloaded the files, open a terminal and navigate to your `lab-p5` directory.
+Run `ls` to make sure the above files are available.
+
+**Note:** If you accidentally downloaded the file with the wrong extension, such as `.txt`, `.cvs`, or `.csv.txt`
+instead of `.csv` (say `hurricanes.cvs`), you can execute `mv hurricanes.cvs hurricanes.csv` in a
+Terminal/PowerShell window. Recall that the `mv` (move) command renames the source file
+(first argument, example: `hurricanes.cvs`) to the destination file (second argument, example:
+`hurricanes.csv`).
+
+----------------------------------
+## Segment 2: Learning the API
+
+You will be finishing the rest of this lab in `lab-p5.ipynb`. Run the command `jupyter notebook` from your Terminal/PowerShell window.
+Remember not to close this
+Terminal/PowerShell window while Jupyter is running, and open a new Terminal/PowerShell
+window if necessary.
+
+**Note**: For P5, you will be working on `p5.ipynb`, which is very similar to `lab-p5.ipynb`.
+It is strongly recommended that you finish this notebook before moving on to P5,
+so you can ask your TA/PM any questions that arise while you work through it.
+
+**Note**: Unlike `p5.ipynb`, you do **not** have to submit `lab-p5.ipynb`. This notebook is solely
+for your practice and preparation for P5.
+
+------------------------------
+
+You can now get started with [P5](https://git.doit.wisc.edu/cdis/cs/courses/cs220/cs220-f23-projects/-/tree/main/p5). **You may use any helper functions created here in P5**. Remember to work on P5 only with your partner from this point on. Have fun!
diff --git a/lab-p5/hurricanes.csv b/lab-p5/hurricanes.csv
new file mode 100644
index 0000000000000000000000000000000000000000..fb683b4ce7f15962bd6e5b326b4c4004d2f6c86a
--- /dev/null
+++ b/lab-p5/hurricanes.csv
@@ -0,0 +1,555 @@
+name,formed,dissipated,mph,damage,deaths
+1804 New England hurricane,10/04/1804,10/11/1804,110,100K,16
+1806 Great Coastal hurricane,08/17/1806,08/25/1806,110,171K,24
+1812 Louisiana hurricane,08/15/1812,08/20/1812,115,6M,100
+1821 Norfolk and Long Island hurricane,09/01/1821,09/04/1821,130,200K,22
+1848 Tampa Bay hurricane,09/23/1848,09/28/1848,130,20K,0
+1867 San Narciso hurricane,10/27/1867,10/31/1867,125,1M,811
+1875 Indianola hurricane,09/08/1875,09/18/1875,115,4M,800
+Gale of 1878,10/18/1878,10/23/1878,105,2M,71
+1886 Indianola hurricane,08/12/1886,08/21/1886,150,200K,74
+1887 Halloween tropical storm,10/29/1887,11/06/1887,70,7K,2
+1891 Martinique hurricane,08/18/1891,08/25/1891,125,10M,700
+1893 Sea Islands hurricane,08/15/1893,09/02/1893,120,1M,2000
+1893 Cheniere Caminada hurricane,09/27/1893,10/05/1893,130,5M,2000
+1894 Greater Antilles hurricane,09/18/1894,10/01/1894,120,5.09M,227
+1896 Cedar Keys hurricane,09/22/1896,09/30/1896,125,338M,202
+1896 East Coast hurricane,10/07/1896,10/13/1896,100,500K,4
+1898 Windward Islands hurricane,09/05/1898,09/19/1898,110,2.5M,392
+1898 Georgia hurricane,09/25/1898,10/06/1898,130,1.5M,179
+1899 Carrabelle hurricane,07/28/1899,08/02/1899,100,1M,9
+1899 San Ciriaco hurricane,08/03/1899,09/12/1899,150,20M,3855
+1900 Galveston hurricane,08/27/1900,09/15/1900,145,1.25B,8000
+1901 Louisiana hurricane,08/02/1901,08/18/1901,90,1M,15
+1903 Jamaica hurricane,08/06/1903,08/16/1903,120,10M,188
+1903 Florida hurricane,09/09/1903,09/16/1903,90,500K,14
+1903 New Jersey hurricane,09/12/1903,09/17/1903,100,8M,57
+1906 Mississippi hurricane,09/19/1906,09/29/1906,120,19221,134
+1906 Florida Keys hurricane,10/08/1906,10/23/1906,120,4.14M,240
+1909 Velasco hurricane,07/13/1909,07/22/1909,115,2M,41
+1909 Monterrey hurricane,08/20/1909,08/28/1909,120,50M,4000
+1909 Grand Isle hurricane,09/13/1909,09/22/1909,120,11M,400
+1909 Florida Keys hurricane,10/06/1909,10/13/1909,120,3M,34
+1909 Greater Antilles hurricane,11/08/1909,11/14/1909,105,10M,198
+1910 Cuba hurricane,10/09/1910,10/23/1910,150,1.25M,116
+1912 Jamaica hurricane,11/11/1912,11/22/1912,115,1.5M,105
+1915 Galveston hurricane,08/05/1915,08/23/1915,145,30M,405
+1915 New Orleans hurricane,09/21/1915,10/01/1915,145,13M,279
+1916 Gulf Coast hurricane,06/28/1916,07/10/1916,120,12.5M,34
+1916 Charleston hurricane,07/11/1916,07/15/1916,115,22M,84
+1916 Texas hurricane,08/12/1916,08/20/1916,130,11.8M,37
+1916 Virgin Islands hurricane,10/06/1916,10/15/1916,120,2M,41
+1916 Pensacola hurricane,10/09/1916,10/19/1916,110,100K,29
+1917 Nueva Gerona hurricane,09/20/1917,09/30/1917,150,2.17M,44
+1919 Florida Keys hurricane,09/02/1919,09/16/1919,150,22M,772
+1920 Louisiana hurricane,09/16/1920,09/23/1920,100,1.45M,1
+1921 Tampa Bay hurricane,10/20/1921,10/30/1921,140,10M,8
+1925 Florida tropical storm,11/27/1925,12/01/1925,65,3M,73
+1926 Nassau hurricane,07/22/1926,08/02/1926,140,7.85M,287
+1926 Louisiana hurricane,08/20/1926,08/27/1926,115,6M,25
+1926 Miami hurricane,09/11/1926,09/22/1926,150,100M,539
+1927 Nova Scotia hurricane,08/18/1927,08/29/1927,125,1.6M,192
+1928 Fort Pierce hurricane,08/03/1928,08/14/1928,105,235K,2
+1928 Haiti hurricane,08/07/1928,08/17/1928,90,2M,210
+1928 Okeechobee hurricane,09/06/1928,09/21/1928,160,1.7B,4112
+1929 Bahamas hurricane,09/22/1929,10/04/1929,155,9.31M,155
+1930 San Zenón hurricane,08/29/1930,09/17/1930,155,50M,8000
+1931 British Honduras hurricane,09/06/1931,09/13/1931,130,7.5M,2500
+1932 Freeport hurricane,08/12/1932,08/15/1932,150,7.5M,40
+1932 Florida–Alabama hurricane,08/26/1932,09/04/1932,85,229K,1
+1932 San Ciprián hurricane,09/25/1932,10/02/1932,145,35.8M,272
+1932 Cuba hurricane,10/30/1932,11/14/1932,175,40M,3103
+1933 Trinidad hurricane,06/24/1933,07/08/1933,110,7.2M,35
+1933 Florida–Mexico hurricane,07/24/1933,08/05/1933,90,67.8M,39
+1933 Chesapeake–Potomac hurricane,08/13/1933,08/28/1933,140,41.2M,47
+1933 Cuba–Brownsville hurricane,08/22/1933,09/05/1933,160,27.9M,179
+1933 Treasure Coast hurricane,08/31/1933,09/07/1933,140,3M,3
+1933 Outer Banks hurricane,09/08/1933,09/22/1933,140,4.75M,24
+1933 Tampico hurricane,09/16/1933,09/25/1933,160,5M,184
+1933 Cuba–Bahamas hurricane,10/01/1933,10/09/1933,125,1.1M,10
+1934 Central America hurricane,06/04/1934,06/21/1934,100,9.46M,506
+1935 Labor Day hurricane,08/29/1935,09/10/1935,185,100M,423
+1935 Cuba hurricane,09/23/1935,10/02/1935,140,14.5M,52
+1935 Jérémie hurricane,10/18/1935,10/27/1935,85,16M,2150
+1935 Yankee hurricane,10/30/1935,11/08/1935,105,5.5M,19
+1936 Mid,09/08/1936,09/25/1936,120,4.05M,2
+1938 New England hurricane,09/09/1938,09/23/1938,160,306M,682
+1940 Louisiana hurricane,08/03/1940,08/10/1940,100,10.8M,7
+1940 South Carolina hurricane,08/05/1940,08/15/1940,100,13M,50
+1940 New England hurricane,08/26/1940,09/03/1940,110,4.05M,7
+1941 Texas hurricane,09/16/1941,09/27/1941,125,7.5M,7
+1941 Florida hurricane,10/03/1941,10/13/1941,120,675K,10
+1942 Matagorda hurricane,08/21/1942,08/31/1942,115,26.5M,8
+1942 Belize hurricane,11/05/1942,11/11/1942,110,4M,9
+1943 Surprise Hurricane,07/25/1943,07/29/1943,105,17M,19
+1944 Great Atlantic hurricane,09/09/1944,09/16/1944,160,100M,400
+1944 Cuba–Florida hurricane,10/12/1944,10/24/1944,145,100M,318
+1945 Outer Banks hurricane,06/20/1945,07/04/1945,100,75K,1
+1945 Texas hurricane,08/24/1945,08/29/1945,115,20.1M,3
+1945 Homestead hurricane,09/12/1945,09/20/1945,130,60M,26
+1946 Florida hurricane,10/05/1946,10/14/1946,100,5.2M,5
+1947 Fort Lauderdale hurricane,09/04/1947,09/20/1947,145,110M,51
+1947 Florida–Georgia hurricane,10/09/1947,10/16/1947,105,42.7M,1
+1948 Bermuda–Newfoundland hurricane,09/04/1948,09/16/1948,130,400K,8
+September 1948 Florida hurricane,09/18/1948,09/26/1948,130,14M,13
+1948 Miami hurricane,10/03/1948,10/16/1948,125,12.5M,11
+1949 Florida hurricane,08/23/1949,08/31/1949,130,52M,2
+1949 Texas hurricane,09/27/1949,10/07/1949,110,6.7M,2
+Able,08/12/1950,08/24/1950,125,1.04M,11
+Baker,08/18/1950,09/01/1950,105,2.55M,38
+Dog,08/30/1950,09/18/1950,145,3M,31
+Easy,09/01/1950,09/09/1950,120,3.3M,2
+King,10/13/1950,10/20/1950,130,32M,11
+Able,05/15/1951,05/24/1951,90,0,0
+Charlie,08/12/1951,08/23/1951,130,75M,259
+How,09/28/1951,10/08/1951,100,2M,17
+1952 Groundhog Day tropical storm,02/03/1952,02/05/1952,70,0,0
+Able,08/18/1952,09/02/1952,100,2.75M,3
+Fox,10/20/1952,10/28/1952,145,10M,601
+Barbara,08/11/1953,08/16/1953,90,1.3M,9
+Carol,08/28/1953,09/08/1953,160,2M,5
+Florence,09/23/1953,09/26/1953,115,200K,0
+Alice,06/24/1954,06/26/1954,110,2M,153
+Carol,08/25/1954,09/01/1954,115,462M,72
+Edna,09/02/1954,09/15/1954,125,42.8M,29
+Hazel,10/05/1954,10/18/1954,130,382M,1191
+Alice,12/30/1954,01/06/1955,90,623K,0
+Connie,08/03/1955,08/15/1955,140,86M,77
+Diane,08/07/1955,08/23/1955,105,832M,184
+Ione,09/10/1955,09/21/1955,140,88M,7
+Hilda,09/10/1955,09/20/1955,120,120M,304
+Janet,09/21/1955,09/30/1955,175,65.8M,1023
+Betsy,08/09/1956,08/18/1956,120,50M,37
+Flossy,09/20/1956,10/03/1956,90,24.9M,15
+Greta,10/30/1956,11/06/1956,100,3.6M,1
+Audrey,06/25/1957,06/29/1957,125,150M,431
+Ella,08/30/1958,09/06/1958,110,200K,36
+Helene,09/21/1958,10/04/1958,150,11.4M,1
+Arlene,05/28/1959,05/31/1959,65,500K,1
+Cindy,07/05/1959,07/11/1959,75,75K,6
+Debra,07/23/1959,07/28/1959,85,7M,0
+Gracie,09/20/1959,10/02/1959,140,14M,22
+1960 Texas tropical storm,06/22/1960,06/29/1960,60,3.6M,18
+Abby,07/10/1960,07/16/1960,80,640K,6
+Brenda,07/28/1960,08/01/1960,70,5M,1
+Donna,08/29/1960,09/14/1960,145,980M,439
+Ethel,09/12/1960,09/17/1960,115,1.5M,1
+Anna,07/20/1961,07/24/1961,105,300K,1
+Carla,09/03/1961,09/17/1961,145,326M,43
+Debbie,09/06/1961,09/19/1961,90,50M,68
+Esther,09/10/1961,09/27/1961,160,6M,7
+Hattie,10/27/1961,11/01/1961,165,60.3M,319
+Alma,08/26/1962,08/30/1962,85,1M,1
+Daisy,09/29/1962,10/08/1962,105,1.1M,32
+Arlene,07/31/1963,08/11/1963,115,300K,0
+Cindy,09/16/1963,09/20/1963,65,12.5M,3
+Edith,09/23/1963,09/29/1963,100,46.6M,10
+Ginny,10/16/1963,10/29/1963,110,500K,3
+Abby,08/05/1964,08/08/1964,70,750K,0
+Cleo,08/21/1964,09/05/1964,150,187M,156
+Dora,08/28/1964,09/14/1964,130,280M,5
+Gladys,09/13/1964,09/24/1964,130,100K,1
+Hilda,09/28/1964,10/05/1964,140,126M,38
+Isbell,10/08/1964,10/19/1964,115,30M,7
+Betsy,08/27/1965,09/13/1965,140,1.42B,81
+Debbie,09/24/1965,09/30/1965,60,25M,0
+Alma,06/04/1966,06/14/1966,125,210M,93
+Inez,09/21/1966,10/11/1966,165,227M,1269
+Beulah,09/05/1967,09/22/1967,160,235M,59
+Doria,09/08/1967,09/21/1967,100,150K,3
+Abby,06/01/1968,06/13/1968,75,450K,6
+Gladys,10/13/1968,10/21/1968,100,18.7M,8
+Camille,08/14/1969,08/22/1969,175,1.42B,259
+Francelia,08/29/1969,09/04/1969,100,35.6M,271
+Martha,11/21/1969,11/25/1969,90,30M,5
+Becky,07/19/1970,07/23/1970,65,500K,1
+Celia,07/31/1970,08/05/1970,140,930M,28
+Dorothy,08/17/1970,08/23/1970,70,34M,51
+Felice,09/12/1970,09/17/1970,70,0,0
+1970 Caribbean–Azores hurricane,09/30/1970,10/22/1970,85,65.5M,22
+1970 Canada hurricane,10/12/1970,10/20/1970,105,1K,0
+Beth,08/10/1971,08/16/1971,85,5.1M,1
+Doria,08/20/1971,08/29/1971,65,148M,7
+Fern,09/03/1971,09/13/1971,90,30.2M,2
+Edith,09/05/1971,09/18/1971,160,25.4M,37
+Ginger,09/10/1971,10/07/1971,110,10M,1
+Irene–Olivia,09/11/1971,10/01/1971,115,1M,3
+Laura,11/12/1971,11/22/1971,70,0,1
+1972 Subtropical Storm Alpha,05/23/1972,05/29/1972,70,100K,2
+Agnes,06/14/1972,07/06/1972,85,2.1B,128
+Carrie,08/29/1972,09/05/1972,70,12.45M,4
+Delia,09/01/1973,09/07/1973,70,6M,2
+Fran,10/08/1973,10/12/1973,80,0,0
+1974 Subtropical Storm One,06/22/1974,06/27/1974,65,10M,3
+Alma,08/12/1974,08/15/1974,65,5M,51
+Carmen,08/29/1974,09/10/1974,150,162M,8
+1974 Subtropical Storm Four,10/04/1974,10/08/1974,50,600K,0
+Amy,06/27/1975,07/04/1975,70,0,1
+1975 Tropical Depression Six,07/28/1975,08/01/1975,35,8.8M,3
+Eloise,09/13/1975,09/24/1975,125,560M,80
+Belle,08/06/1976,08/15/1976,120,100M,3
+Anita,08/29/1977,09/04/1977,175,946M,11
+Babe,09/03/1977,09/09/1977,75,13M,0
+Amelia,07/30/1978,08/01/1978,50,110M,33
+Cora,08/07/1978,08/12/1978,90,0,1
+Debra,08/26/1978,08/29/1978,60,0,2
+Ella,08/30/1978,09/05/1978,140,0,0
+Greta–Olivia,09/13/1978,09/23/1978,130,26M,5
+1979 Tropical Depression One,06/11/1979,06/16/1979,35,27M,41
+Bob,07/09/1979,07/16/1979,75,20M,1
+Claudette,07/16/1979,07/29/1979,50,400M,2
+David,08/25/1979,09/08/1979,175,1.54B,2078
+Frederic,08/29/1979,09/15/1979,130,1.77B,12
+Elena,08/29/1979,09/02/1979,40,10M,2
+Henri,09/14/1979,09/24/1979,85,0,0
+Allen,07/31/1980,08/11/1980,190,1.57B,269
+Danielle,09/04/1980,09/07/1980,60,25M,3
+Jeanne,11/07/1980,11/16/1980,100,0,0
+Karl,11/25/1980,11/29/1980,85,0,0
+Arlene,05/06/1981,05/09/1981,60,0,0
+Dennis,08/07/1981,08/26/1981,80,28.5M,3
+1981 Tropical Depression Eight,08/26/1981,08/19/1981,35,56.2M,5
+Katrina,11/03/1982,11/08/1981,85,0,2
+Alberto,06/01/1982,06/06/1982,85,85M,23
+1982 Florida subtropical storm,06/18/1982,06/22/1982,70,10M,3
+Beryl,08/28/1982,09/06/1982,70,3M,3
+Chris,09/09/1982,09/13/1982,65,2M,0
+Alicia,08/15/1983,08/21/1983,115,3B,21
+Barry,08/23/1983,08/29/1983,80,0,0
+Diana,09/08/1984,09/16/1984,130,65.5M,3
+Fran,09/15/1984,09/20/1984,65,2.8M,32
+Isidore,09/25/1984,10/01/1984,60,1M,1
+Klaus,11/05/1984,11/16/1984,90,152M,2
+Lili,12/12/1984,12/24/1984,80,0,0
+Bob,07/21/1985,07/26/1985,75,20M,5
+Danny,08/12/1985,08/20/1985,90,100M,5
+Elena,08/28/1985,09/04/1985,125,1.3B,9
+Gloria,09/16/1985,10/04/1985,145,900M,14
+Juan,10/26/1985,11/03/1985,85,1.5B,12
+Kate,11/15/1985,11/23/1985,120,700M,15
+Bonnie,06/23/1986,06/28/1986,85,42M,5
+Charley,08/15/1986,08/30/1986,80,15M,15
+Danielle,09/07/1986,09/10/1986,60,10.5M,0
+Arlene,08/10/1987,08/23/1987,75,8K,0
+Emily,09/20/1987,09/26/1987,125,80.3M,3
+Floyd,10/09/1987,10/13/1987,75,500K,1
+1987 Tropical Depression Fourteen,10/31/1987,11/04/1987,35,1.8M,6
+Beryl,08/08/1988,08/10/1988,50,3M,1
+Chris,08/21/1988,08/30/1988,50,2.2M,6
+Florence,09/07/1988,09/11/1988,80,2.9M,1
+Gilbert,09/08/1988,09/19/1988,185,2.98B,318
+Joan–Miriam,10/10/1988,11/02/1988,70,2B,334
+Keith,11/17/1988,11/26/1988,70,7.3M,0
+Allison,06/24/1989,07/07/1989,50,560M,11
+Chantal,07/30/1989,08/03/1989,80,100M,13
+Dean,07/31/1989,08/08/1989,105,8.9M,0
+Gabrielle,08/30/1989,09/13/1989,145,0,9
+Hugo,09/10/1989,09/25/1989,160,11B,67
+Jerry,10/12/1989,10/16/1989,85,70M,3
+Bertha,07/24/1990,08/02/1990,80,3.91M,9
+Diana,08/04/1990,08/09/1990,100,90.7M,139
+Gustav,08/24/1990,09/03/1990,120,0,0
+Klaus,10/03/1990,10/09/1990,80,1M,11
+Marco,10/09/1990,10/13/1990,65,57M,12
+Bob,08/16/1991,08/29/1991,115,1.5B,17
+Grace,10/25/1991,10/30/1991,105,0,0
+1991 Perfect Storm,10/28/1991,11/02/1991,75,200M,13
+1992 Tropical Depression One,06/25/1992,06/26/1992,35,2.6M,4
+Andrew,08/16/1992,08/29/1992,175,27.3B,65
+Bonnie,09/17/1992,09/30/1992,110,0,1
+Arlene,06/18/1993,06/21/1993,40,60.8M,26
+Bret,08/04/1993,08/11/1993,60,35.7M,213
+Cindy,08/14/1993,08/17/1993,45,19M,4
+Emily,08/22/1993,09/06/1993,115,35M,3
+Gert,09/14/1993,09/26/1993,100,170M,116
+Alberto,06/30/1994,07/07/1994,65,1.03B,32
+Beryl,08/14/1994,08/19/1994,60,74.2M,5
+Debby,09/09/1994,09/11/1994,70,115M,9
+Florence,11/02/1994,11/08/1994,110,0,0
+Gordon,11/08/1994,11/21/1994,85,594M,1152
+Allison,06/03/1995,06/11/1995,75,1.7M,1
+Dean,07/28/1995,08/02/1995,45,500K,1
+Erin,07/31/1995,08/06/1995,100,700M,16
+Felix,08/08/1995,08/25/1995,140,3.63M,9
+Gabrielle,08/09/1995,08/12/1995,70,0,6
+Jerry,08/22/1995,08/28/1995,40,40M,6
+Luis,08/28/1995,09/12/1995,150,3.3B,19
+Marilyn,09/12/1995,09/30/1995,115,2.5B,13
+Opal,09/27/1995,10/06/1995,150,4.7B,63
+Roxanne,10/07/1995,10/21/1995,115,1.5B,29
+Tanya,10/27/1995,11/03/1995,85,0,1
+Arthur,06/17/1996,06/24/1996,45,1M,0
+Bertha,07/05/1996,07/18/1996,115,335M,12
+Cesar–Douglas,07/24/1996,08/06/1996,130,203M,113
+Edouard,08/19/1996,09/07/1996,145,20M,2
+Fran,08/23/1996,09/10/1996,120,5B,22
+Hortense,09/03/1996,09/16/1996,140,158M,39
+Josephine,10/04/1996,10/13/1996,70,130M,3
+Lili,10/14/1996,10/30/1996,115,662M,22
+Marco,11/16/1996,11/26/1996,75,8.2M,15
+Danny,07/16/1997,07/27/1997,80,100M,9
+Erika,09/03/1997,09/20/1997,125,10M,2
+Bonnie,08/19/1998,08/30/1998,115,1B,5
+Charley,08/21/1998,08/24/1998,70,50M,20
+Danielle,08/24/1998,09/08/1998,105,50K,0
+Earl,08/31/1998,09/08/1998,100,79M,3
+Frances,09/08/1998,09/13/1998,65,500M,1
+Georges,09/15/1998,10/01/1998,155,9.37B,604
+Hermine,09/17/1998,09/20/1998,45,85K,2
+Bret,08/18/1999,08/25/1999,145,15M,1
+Dennis,08/24/1999,09/09/1999,105,157M,6
+Floyd,09/07/1999,09/19/1999,155,6.5B,85
+Gert,09/11/1999,09/23/1999,150,1.9M,2
+Harvey,09/19/1999,09/22/1999,60,22.6M,0
+Irene,10/13/1999,10/24/1999,110,800M,3
+Jose,10/17/1999,10/25/1999,100,5M,3
+Katrina,10/28/1999,11/01/1999,40,9K,0
+Lenny,11/13/1999,11/23/1999,155,786M,17
+Alberto,08/03/2000,08/25/2000,125,0,0
+Beryl,08/13/2000,08/15/2000,50,27K,1
+Debby,08/19/2000,08/24/2000,85,735K,1
+Florence,09/10/2000,09/19/2000,80,0,3
+Gordon,09/14/2000,09/21/2000,80,10.8M,24
+Helene,09/15/2000,09/25/2000,70,16M,1
+Isaac,09/21/2000,10/04/2000,140,0,1
+Joyce,09/25/2000,10/02/2000,90,0,0
+Keith,09/28/2000,10/06/2000,140,319M,62
+Leslie,10/04/2000,10/12/2000,45,950M,3
+Allison,06/05/2001,06/20/2001,60,9B,41
+Barry,08/02/2001,08/08/2001,70,30M,2
+Chantal,08/14/2001,08/22/2001,70,4M,2
+Dean,08/22/2001,08/28/2001,70,7.7M,0
+Erin,09/01/2001,09/17/2001,120,0,0
+Gabrielle,09/11/2001,09/19/2001,80,230M,2
+Iris,10/04/2001,10/09/2001,145,250M,36
+Karen,10/12/2001,10/15/2001,80,1.4M,0
+Michelle,10/29/2001,11/06/2001,140,2.43B,48
+Arthur,07/14/2002,07/19/2002,60,0,1
+Bertha,08/04/2002,08/09/2002,40,200K,1
+Cristobal,08/05/2002,08/13/2002,50,0,3
+Fay,09/05/2002,09/11/2002,60,4.5M,0
+Gustav,09/08/2002,09/15/2002,100,340K,4
+Hanna,09/12/2002,09/15/2002,60,20M,3
+Isidore,09/14/2002,09/27/2002,125,1.28B,19
+Kyle,09/20/2002,10/14/2002,85,5M,1
+Lili,09/21/2002,10/04/2002,145,1.16B,15
+Ana,04/20/2003,04/27/2003,60,0,2
+Bill,06/29/2003,07/03/2003,60,50.5M,4
+Claudette,07/08/2003,07/17/2003,90,181M,3
+Erika,08/14/2003,08/20/2003,75,100K,2
+Fabian,08/27/2003,09/10/2003,145,300M,8
+Grace,08/30/2003,09/02/2003,40,113K,0
+Henri,09/03/2003,09/08/2003,60,19.6M,0
+Isabel,09/06/2003,09/20/2003,165,3.6B,51
+Juan,09/24/2003,09/29/2003,105,200M,8
+Kate,09/25/2003,10/10/2003,125,0,0
+Larry,10/01/2003,10/06/2003,65,53.6M,5
+Nicholas,10/13/2003,11/05/2003,70,0,0
+Odette,12/04/2003,12/09/2003,65,8M,8
+Alex,07/31/2004,08/06/2004,120,7.5M,1
+Bonnie,08/03/2004,08/14/2004,65,1.27M,3
+Charley,08/09/2004,08/15/2004,150,16.9B,35
+Earl,08/13/2004,08/15/2004,50,0,1
+Frances,08/24/2004,09/10/2004,145,10.1B,50
+Gaston,08/27/2004,09/03/2004,75,130M,8
+Ivan,09/02/2004,09/25/2004,165,26.1B,124
+Jeanne,09/13/2004,09/29/2004,120,7.94B,3037
+Karl,09/16/2004,09/28/2004,145,0,0
+Matthew,10/08/2004,10/11/2004,45,305K,0
+Arlene,06/08/2005,06/14/2005,70,11.8M,2
+Bret,06/28/2005,06/30/2005,40,9.3M,3
+Cindy,07/03/2005,07/12/2005,75,71.5M,0
+Dennis,07/04/2005,07/18/2005,150,3.98B,88
+Emily,07/11/2005,07/21/2005,160,1.01B,22
+Gert,07/23/2005,07/25/2005,45,6M,1
+Irene,08/04/2005,08/18/2005,105,0,1
+Jose,08/22/2005,08/23/2005,60,45M,16
+Katrina,08/23/2005,08/31/2005,175,125B,1392
+Maria,09/01/2005,09/14/2005,115,3.1M,1
+Nate,09/05/2005,09/13/2005,90,0,2
+Ophelia,09/06/2005,09/23/2005,85,70M,3
+Rita,09/18/2005,09/26/2005,180,18.5B,120
+Stan,10/01/2005,10/05/2005,80,3.96B,1668
+Tammy,10/05/2005,10/06/2005,50,30M,10
+Vince,10/08/2005,10/11/2005,75,0,0
+Wilma,10/15/2005,10/27/2005,185,22.4B,52
+Beta,10/26/2005,10/31/2005,115,15.5M,9
+Gamma,11/14/2005,11/22/2005,50,18M,39
+Delta,11/22/2005,11/30/2005,70,364M,7
+Epsilon,11/29/2005,12/10/2005,85,0,0
+Zeta,12/30/2005,01/07/2006,65,0,0
+Alberto,06/10/2006,06/19/2006,70,420K,3
+Beryl,07/18/2006,07/21/2006,60,0,0
+Chris,08/01/2006,08/04/2006,65,0,0
+Debby,08/21/2006,08/26/2006,50,0,0
+Ernesto,08/24/2006,09/01/2006,75,500M,11
+Florence,09/03/2006,09/19/2006,90,200K,0
+Gordon,09/10/2006,09/24/2006,120,3.8M,0
+Helene,09/12/2006,09/24/2006,120,0,0
+Isaac,09/27/2006,10/02/2006,85,0,0
+2007 Subtropical Storm Andrea,05/09/2007,05/14/2007,60,0,6
+Barry,06/01/2007,06/05/2007,60,118K,1
+Chantal,07/31/2007,08/05/2007,50,24.3M,0
+Dean,08/13/2007,08/27/2007,175,1.66B,45
+Erin,08/15/2007,08/20/2007,40,248M,21
+Felix,08/31/2007,09/07/2007,175,720M,130
+Gabrielle,09/08/2007,09/11/2007,60,0,1
+Humberto,09/12/2007,09/14/2007,90,50M,1
+2007 Tropical Depression Ten,09/21/2007,09/22/2007,35,6.2M,0
+Lorenzo,09/25/2007,09/28/2007,80,92M,6
+Noel,10/28/2007,11/07/2007,80,580M,222
+Olga,12/11/2007,12/17/2007,60,45M,40
+Arthur,05/31/2008,06/06/2008,45,78M,5
+Bertha,07/03/2008,07/21/2008,125,0,3
+Cristobal,07/19/2008,07/23/2008,65,10K,0
+Dolly,07/20/2008,07/27/2008,100,1.6B,22
+Edouard,08/03/2008,08/06/2008,65,550K,6
+Fay,08/15/2008,08/29/2008,70,560M,36
+Gustav,08/25/2008,09/07/2008,155,8.31B,153
+Hanna,08/28/2008,09/12/2008,85,160M,537
+Ike,09/01/2008,09/15/2008,145,38B,214
+Kyle,09/25/2008,09/30/2008,85,57.1M,8
+Laura,09/29/2008,10/04/2008,60,0,0
+Marco,10/06/2008,10/07/2008,65,0,0
+Omar,10/13/2008,10/21/2008,130,80M,1
+Paloma,11/05/2008,11/14/2008,145,455M,1
+Ana,08/11/2009,08/16/2009,40,0,0
+Bill,08/15/2009,08/26/2009,130,46.2M,2
+Claudette,08/16/2009,08/18/2009,60,350K,2
+Danny,08/26/2009,08/29/2009,60,0,1
+Erika,09/01/2009,09/04/2009,50,35K,0
+Fred,09/07/2009,09/19/2009,120,0,0
+Grace,10/04/2009,10/07/2009,65,0,0
+Ida,11/04/2009,11/11/2009,105,11.4M,4
+Alex,06/25/2010,07/06/2010,110,1.52B,51
+2010 Tropical Depression Two,07/08/2010,07/10/2010,35,0,0
+Bonnie,07/22/2010,07/25/2010,45,1.36M,1
+Colin,08/02/2010,08/09/2010,60,0,1
+2010 Tropical Depression Five,08/10/2010,08/18/2010,35,1M,2
+Earl,08/25/2010,09/05/2010,145,45M,8
+Hermine,09/03/2010,09/10/2010,70,740M,52
+Igor,09/08/2010,09/23/2010,155,200M,4
+Julia,09/12/2010,09/28/2010,140,0,0
+Karl,09/14/2010,09/18/2010,125,3.9B,22
+Matthew,09/23/2010,09/28/2010,60,171M,126
+Nicole,09/28/2010,09/30/2010,45,245M,20
+Otto,10/06/2010,10/18/2010,85,22.5M,0
+Richard,10/20/2010,10/27/2010,100,80M,1
+Shary,10/27/2010,10/31/2010,75,0,0
+Tomas,10/29/2010,11/11/2010,100,463M,44
+Arlene,06/28/2011,07/01/2011,65,223M,18
+Bret,07/17/2011,07/23/2011,70,0,0
+Emily,08/02/2011,08/11/2011,50,5M,4
+Harvey,08/19/2011,08/22/2011,65,0,5
+Irene,08/21/2011,08/30/2011,120,14.2B,58
+Katia,08/29/2011,09/13/2011,140,157M,3
+Lee,09/02/2011,09/07/2011,60,2.8B,18
+Maria,09/06/2011,09/18/2011,80,1.3M,0
+Nate,09/07/2011,09/12/2011,75,0,4
+Ophelia,09/20/2011,10/07/2011,140,0,0
+Rina,10/23/2011,10/29/2011,115,2.3M,0
+Beryl,05/26/2012,06/02/2012,70,148K,1
+Debby,06/23/2012,06/30/2012,65,250M,5
+Ernesto,08/01/2012,08/10/2012,100,252M,12
+Helene,08/09/2012,08/18/2012,45,17M,2
+Isaac,08/21/2012,09/03/2012,80,3.11B,41
+Leslie,08/30/2012,09/12/2012,80,10.1M,0
+Nadine,09/10/2012,10/04/2012,90,0,0
+Rafael,10/12/2012,10/26/2012,90,2M,1
+Sandy,10/22/2012,11/02/2012,115,68.7B,233
+Andrea,06/05/2013,06/10/2013,65,86K,1
+Barry,06/17/2013,06/20/2013,45,0,5
+Chantal,07/07/2013,07/10/2013,65,10M,1
+Ingrid,09/12/2013,09/17/2013,85,1.5B,32
+Karen,10/03/2013,10/15/2013,65,18K,0
+Arthur,07/01/2014,07/09/2014,100,39.5M,2
+Bertha,08/01/2014,08/16/2014,80,0,4
+Dolly,09/01/2014,09/04/2014,50,22.2M,1
+Fay,10/10/2014,10/13/2014,80,3.8M,0
+Gonzalo,10/12/2014,10/20/2014,145,317M,6
+Ana,05/08/2015,05/12/2015,60,0,1
+Bill,06/16/2015,06/23/2015,60,100M,8
+Danny,08/18/2015,08/24/2015,125,0,0
+Erika,08/24/2015,09/03/2015,50,511M,35
+Fred,08/30/2015,09/06/2015,85,2.5M,9
+Joaquin,09/28/2015,10/15/2015,155,200M,34
+Kate,11/08/2015,11/13/2015,85,0,0
+Alex,01/12/2016,01/17/2016,85,0,1
+Bonnie,05/27/2016,06/09/2016,45,640K,2
+Colin,06/05/2016,06/08/2016,50,1.04M,6
+Hermine,08/28/2016,09/08/2016,80,550M,4
+Julia,09/13/2016,09/21/2016,50,6.13M,0
+Matthew,09/28/2016,10/10/2016,165,16.5B,603
+Nicole,10/04/2016,10/20/2016,140,15M,1
+Otto,11/20/2016,11/26/2016,115,192M,23
+Bret,06/19/2017,06/20/2017,50,3M,1
+Cindy,06/20/2017,06/24/2017,60,25M,2
+Emily,07/30/2017,08/02/2017,60,10M,0
+Franklin,08/07/2017,08/10/2017,85,15M,0
+Gert,08/12/2017,08/18/2017,110,0,2
+Harvey,08/17/2017,09/02/2017,130,125B,107
+Irma,08/30/2017,09/13/2017,180,77.2B,52
+Jose,09/05/2017,09/25/2017,155,2.84M,1
+Katia,09/05/2017,09/09/2017,105,3.26M,3
+Maria,09/16/2017,10/02/2017,175,91.6B,3059
+Nate,10/04/2017,10/11/2017,90,787M,48
+Ophelia,10/09/2017,10/18/2017,115,87.7M,3
+Philippe,10/28/2017,10/29/2017,40,100M,5
+2017 Potential Tropical Cyclone Ten,08/27/2017,09/03/2017,45,1.92M,2
+Alberto,05/25/2018,06/01/2018,65,125M,18
+Beryl,07/04/2018,07/17/2018,80,1M,0
+Chris,07/06/2018,07/17/2018,105,0,1
+Florence,08/31/2018,09/18/2018,150,24.2B,24
+Gordon,09/03/2018,09/08/2018,70,200M,3
+Kirk,09/22/2018,09/28/2018,65,440K,2
+Leslie,09/23/2018,10/16/2018,90,500M,17
+Michael,10/07/2018,10/16/2018,160,25.5B,74
+Barry,07/11/2019,07/19/2019,75,900M,3
+Dorian,08/24/2019,09/10/2019,185,5.1B,84
+Fernand,09/03/2019,09/05/2019,50,11.3M,1
+Humberto,09/13/2019,09/20/2019,125,25M,2
+Imelda,09/17/2019,09/19/2019,45,5B,7
+Karen,09/22/2019,09/27/2019,45,3.53M,0
+Lorenzo,09/23/2019,10/07/2019,160,367M,19
+Melissa,10/11/2019,10/14/2019,65,24K,0
+Nestor,10/18/2019,10/21/2019,60,150M,3
+Pablo,10/25/2019,10/29/2019,80,0,0
+Olga,10/25/2019,10/27/2019,45,400M,2
+Arthur,05/16/2020,05/21/2020,60,112K,0
+Bertha,05/27/2020,05/28/2020,50,130K,1
+Amanda and Cristobal,06/01/2020,06/12/2020,60,865M,46
+Fay,07/09/2020,07/12/2020,60,220M,6
+Hanna,07/23/2020,07/26/2020,90,1.2B,9
+Isaias,07/30/2020,08/05/2020,90,5.03B,17
+Laura,08/20/2020,08/29/2020,150,23.3B,81
+Marco,08/21/2020,08/26/2020,75,35M,0
+Paulette,09/07/2020,09/28/2020,105,50M,2
+Sally,09/11/2020,09/18/2020,110,7.3B,4
+Teddy,09/12/2020,09/24/2020,140,35M,3
+2020 Subtropical Storm Alpha,09/17/2020,09/19/2020,50,24.2M,1
+Beta,09/17/2020,09/25/2020,65,225M,1
+Gamma,10/02/2020,10/06/2020,75,100M,6
+Delta,10/04/2020,10/12/2020,140,3.09B,6
+Epsilon,10/19/2020,10/26/2020,115,0,1
+Zeta,10/24/2020,10/30/2020,115,4.4B,9
+Eta,10/31/2020,11/14/2020,150,8.3B,175
+Iota,11/13/2020,11/18/2020,155,1.4B,84
+Claudette,06/19/2021,06/23/2021,45,375M,4
+Danny,06/27/2021,06/29/2021,45,5K,0
+Elsa,06/30/2021,07/10/2021,85,1.2B,13
+Fred,08/11/2021,08/20/2021,65,1.3B,7
+Grace,08/13/2021,08/21/2021,120,513M,16
+Henri,08/15/2021,08/21/2021,75,700M,2
+Ida,08/26/2021,09/05/2021,150,75.3B,107
+Larry,08/31/2021,09/12/2021,125,80M,5
+Mindy,09/08/2021,09/11/2021,60,75.2M,23
+Nicholas,09/12/2021,09/20/2021,75,1.1B,2
+Alex,06/05/2022,06/07/2022,70,0,4
+Bonnie,07/01/2022,07/11/2022,115,25M,5
+Danielle,09/01/2022,09/15/2022,85,0,0
+Earl,09/02/2022,09/15/2022,110,0,2
+Fiona,09/14/2022,09/27/2022,140,3.09B,29
+Ian,09/23/2022,10/01/2022,160,113B,161
+Hermine,09/23/2022,09/26/2022,40,9.8M,0
+Julia,10/07/2022,10/10/2022,85,406M,35
+Nicole,11/07/2022,11/11/2022,75,1B,11
diff --git a/lab-p5/images/README.md b/lab-p5/images/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..1c55d5bc28109951607c70092c6ad7fc3071a006
--- /dev/null
+++ b/lab-p5/images/README.md
@@ -0,0 +1,3 @@
+# Images
+
+Images from lab-p5 are stored here.
diff --git a/lab-p5/images/table.png b/lab-p5/images/table.png
new file mode 100644
index 0000000000000000000000000000000000000000..2b2472d15b63c4c31bec11d975f912844c17571b
Binary files /dev/null and b/lab-p5/images/table.png differ
diff --git a/lab-p5/lab-p5.ipynb b/lab-p5/lab-p5.ipynb
new file mode 100644
index 0000000000000000000000000000000000000000..02f73bb0f4e603c5466b59721e76852d19d486b2
--- /dev/null
+++ b/lab-p5/lab-p5.ipynb
@@ -0,0 +1,3502 @@
+{
+ "cells": [
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "15d34763",
+   "metadata": {
+    "cell_type": "code",
+    "deletable": false,
+    "editable": false
+   },
+   "outputs": [],
+   "source": [
+    "# import and initialize otter\n",
+    "import otter\n",
+    "grader = otter.Notebook(\"lab-p5.ipynb\")"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "aa4a9352",
+   "metadata": {
+    "editable": false,
+    "execution": {
+     "iopub.execute_input": "2023-10-04T01:15:48.509644Z",
+     "iopub.status.busy": "2023-10-04T01:15:48.509644Z",
+     "iopub.status.idle": "2023-10-04T01:15:48.726758Z",
+     "shell.execute_reply": "2023-10-04T01:15:48.725729Z"
+    }
+   },
+   "outputs": [],
+   "source": [
+    "import public_tests"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "4d7af2a1",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "# Lab-P5: Looping Patterns and Hurricane API\n",
+    "\n",
+    "**WARNING:** Please go through Segment 1 of [Lab-P5](https://git.doit.wisc.edu/cdis/cs/courses/cs220/cs220-f23-projects/-/tree/main/lab-p5) **before** you start to solve this notebook."
+   ]
+  },
+  {
+   "attachments": {
+    "table.PNG": {
+     "image/png": "iVBORw0KGgoAAAANSUhEUgAAAuwAAACcCAYAAAAknf4UAAAAAXNSR0IArs4c6QAAAARnQU1BAACxjwv8YQUAAAAJcEhZcwAAEnQAABJ0Ad5mH3gAADWrSURBVHhe7Z3bjhU9c4bnouZormWO5z7mDuY0/AiBGERAhCAIIAIi/IgVhBAIkYgTznMbTnvX7U3Zrt63u99HKn10t9uuncs1i/UxZ//3f/8nIBAIBAKBQCAQyDYFDTsEAoFAIBAIBLJhQcMOgUAgEAgEAoFsWNCwQyAQCAQCgUAgGxY07BAIBAKBQCAQyIYFDTsEAoFAIBAIBLJhQcMOgUAgEAgEAoFsWNCwQyAQCAQCgUAgGxY07BAIBAKBQCAQyIYFDTsEAoFAIBAIBLJh2UzD/j//8z/kfQgEAjm6oD6uJ/D9vqSGeCLneHIUP1k7N9Ow//r1i7wPgUAgRxfUx/UEvt+X1BBP5BxPjuIna+dmGvafP3+S9yEQCOTogvq4nsD3+5Ia4omc48lR/GTt3EzD/v37d/I+BAKBHF1QH9cT+H5fUkM8kXM8OYqfrJ2rNezhd4++ffvmXUMgEAhEC+rjegLf70tqiCdyjidH8ZO1c5WGXTbr//Ef/+Hd++///m/vGgKBQCBaUB/XE/h+X1JDPJFzPDmKn6ydizfstlkPG/YvX7541xDIanJ7Kc7OzsT59Yl+DoEsLNXVR7OHQrm8JcZuXI5wNp2uz4l4XYpbYmztMjaetK+m9ddmci6xj7eSG0v6KRX3JWqatXPRht1t1sOG/Z///Kd3DYGsIydxfd406+fN5jy/FidyDASyrFRXH9VB7x/q+sA7F9cnZ1wFsuuz6XQtzmXjQdS628v6YsWRsfHUeRw3rG1Dd3nr3R8im8k5Yh9vSZb0k4rvSj2BtXOxhj1s1sOG/b/+67+8awhkFVEHWFOg1H/3eWBB6pPq6iN50N+Ky6ahqe1vrvZ7NtUZj7EyNp6phl2J+QForE83k3Mbb9iX9NOaDbu1c5GGnWrWw4b9w4cP3jUEsoaoTWk+Ibm9xNdiINuQ6upjpmGv7Wsxez2bZH074t8ijo1ntmFvZAq/bibnNt6wL+mnNRt2a+fsDXuqWZfijvvP//xP7xoCWV6ChmLjxQpyHKmuPhJ7p9YGcZ9n0zE/XZcyNp6lhl3n/ri/nd1Mzm38DFzST2s27NbOWRv2XLMuxR379u1b7xoCWVyi4lTnJ4KQ/Ul19VHtpaZB96TOH353eTaZr24csbaNjecSDftmco7cx41M8D39KWRJP+m4h75YpqZZO2dr2EvNuhR3fHgNgSwt6hPAoBBR9yCQpaW6+kh9MjfR93uXll2eTQdu2MfGc4mGfTM5R+3jDcmSflrzE3Zr5ywNO6dZl+K+8/LlS+8aAllUzAEW/wQtBf/zKWRdqa4+Jg76YrOzQdnn2XTcr8SMjWcph6fI8c3k3MYb9iX9tGbDbu2cvGHnNutS3Pf+/d//3buGQJaUdJE97sEG2Y5UVx9TB/0Enz4uLfs8m/Q/X1vj/1MwVsbGM9+QG7+O/FvZzeTcxhv2Jf20ZsNu7Zy0Ye/TrEtx3/23f/s37xoCWU7yRfao/5oCZDtSXX3c0Sfsuz2bVIyO92HE2Hhmc3iiH0g3k3Mbb9iX9NOaDbu1c7KGvW+zLsV9/+nTp941BLKYlL7PaQ42/M+nkLWkuvpIHfSVNoh7Ppt080l/WLHXX5w0Np6phl19sDPRObGZnNt4w76kn9Zs2K2dkzTsQ5p1Ke4c//qv/+pdQyBLiS60uaI0zV9zQiBDpbr6aJrzUGr8oXf/Z5P+2l8Ur53Wu7HxbH/ImdFfm8m5xD6WsoW9vKSf1mzYrZ2jG/ahzboUd55Hjx551xAIBALRgvq4nsD3+5Ia4omc48lR/GTtHNWwj2nWpbhzPXz40LuGQCAQiBbUx/UEvt+X1BBP5BxPjuIna+ek/9PpGLl//z55HwKBQI4uqI/rCXy/L6khnsg5nhzFT9bOzTTs9+7dI+9DIBDI0QX1cT2B7/clNcQTOceTo/jJ2rmZhv0f//gHeR8CgUCOLqiP6wl8vy+pIZ7IOZ4cxU/WzrM/f/6I//3f/xW/f/8Wv379Ej9//hQ/fvwQ379/F9++fRNfv34Vp9NJfPnyRXz+/Fl8+vRJfPz4UXz48EG8f/9evHv3Trx9+1a8fv1avHr1Sv1GphcvXojnz5+LZ8+eqX+O5smTJ+Lx48fqi/PyuzgPHjxQH/HLnxru3r0r7ty5I/7lX/4FAoFAIBAIBAKBBLKZT9ilMgAAAGLkhypgHeD7fVFDPJFzPI7iJ2snGnYAANg4OMDXA77fFzXEEznH4yh+snaiYQcAgI2DA3w94Pt9UUM8kXM8juInaycadgAA2Dg4wNcDvt8XNcQTOcfjKH6ydqJhBwCAjYMDfD3g+31RQzyRczyO4idrJxp2AADYODjA1wO+3xc1xBM5x+MofrJ2TtOw316Ks7NLcUs9YwoadgAAoEkeTH9vxMXZWVN/G7m4EX/N7Zp4c7Vt3blNgW/HX3FzMYVdU80zP1uPo2XpJk/55eqNueIxLOeOx1b9NCTmOaydoxr228tGKXtYoGEHAIBZoA8m08xNeDCswdabjmFNARr2rcKN51QMad622ohuja36aUjMc1g7hzfsp2txfnYurk/Nn/EJOwAAzAZ9ML0RV2dn4uKm7uN6603HbpunN1fNuX0hpkqfWuznxnMqhjRvu825iVndT4k9NCTmOayd+EoMAABsHPpgQsO+BLttntCwL8KQ5m23OTcxq/sJDXuHNfrvzUUzf/NnKYTT1Tj7XErgqOw8yuHBPRf3eWoMAADMSHQwhXWpEa9xd7/bbiQ8P2xdbOunqW1z10uvXtv1N1xXU01ByY7IriAmXjwyz6h5Zay9+JxdNT+++Xj6SXEmjZ414uXPDuNooeKpdG9s6Jvz8r1SLLJzJxiacxJvjBnnktWnsL9Lc7df4XLHKHF8wsgtLmP8xNEjZ2/0rBG7h9SzlI8tuXoQYO3cfsMujWktiT9RkmO8QmOc4Bqfmqc0t3a2u/nq+T4hAGA/0AdT4hN2cxB5B4C5F9ZOvwZq5qyXeu5uTHugbbimJhu8gh1qTHsd+kv6xr6fexbOY9dupI0F7WcvL4hzsRnUzBN/OrjXOFrS8WykZ85zY1GaJ2RMzpXiPsbW/NzGdmcxWud8bvVhqJ+4OV7yZTOomSfxCXvWn+G1XN/Vx8faWcUn7C7qXja4iaQJ3inf0w4NlpeDyAABAMBcUAeTrVHeoULUP4uqb84hFV5b5quXlL703Fsi9j3PDso/4Tua3LPSvIbiuUTkBfnOfuNoSTZ5ge6ce5xYqDGBQ8n3HIbmXMxU/RBFODeRO0Te5HOrH8P8NFQP7h4y6wULUDqk9nyItXMHDbtxYmO8J8578TuMe+anqWheJcOSCwAAhkA1GU3FIop+4jCSMBoJyWz1kvqEqoGae0tEvmfaEV63n/QR50fuGTlvuHikU/lcbCaK19txHC3UXqJ059zjxII
aQ83tMjTnOHGn1ubdK80dN7RqDvuhACe3ejLIT2w9Bu6hBrVeoEToz9yeD7F2Vt6wG4dSSRUmjTeGcS8ReAAAWBqqyWgqVtyw5+qWOlyCT9iJgbPVS85hukGGNk8pu9T95v3k324Ez8h5w8U9nXjnYjNRs07QLOw4jhZqL1G6c+6VY0GPoeZ2GZZzM/ZDdp7s3Oa60bMTJ8c5udWTQX5i6cHzZTNxvIca1HrBApSPJep+6KsAa2fdDTvp+NiplKPK9/r9lQUAAMwF1WTQNUrfC+umJKx56poxTjJNvaTHUHNvidj3PDuyduWahuAZa163cWCei947LfuNo4XaS5TunHvFWKjLZkwQaPI9h0E5N2c/xJlbjYmb1w5ObvVj2N5k6DFqD5n1/JdJH7eQ63VYOyv/hF07PkpG6l4uGQ3hPf1XFkEwZIBSTgcAgBmgmoymGJEHj/2rVq90qoPFv6fqXVBfJXPWS/VO+OmxrNcbrqnJBq9gh+cf6Qff+Z2vcs/UJTGvXKt9JzwHeediqknYaxwtyXgGunPutXaXfB84mZrbZVjOzdkPceY2Y0Jx5p26pxq6N8t6jNtDamxwU92z8xf2fIi1s/7vsBuHtcnRjA/fi95p4N6zh18rwXMAAJgb6mBqKpY6VMhPitQB4NQt4jAI66Rl3nppPqUyz6Xu6p0N11Xa92U7Qv+oazNein9eF56F8zQDPF+H/mOcixJ3XffRHuNoSTZ5ge6ce9anuVjYMS7U3C5Dc44Td2pt1r3i3Fo/31SjszNPObf4DPZTQ1GPEXsoOc5Zw33PfZfC2jlNwz6BUA07AACA1MEElmBrvqeaAcBnynjOFYsa97tugLtPti2p+1NwlLpo7UTDDgAAG+coB9MW2Zrv0bCPY8p4omHvsJ9Y++6Iv1oyJUepi9ZONOwAALBxjnIwbZGt+R4N+zimjCcadp/oayaNyK+kzMVR6qK1Ew07AABsnKMcTFsEvt8XNcQTOcfjKH6ydqJhBwCAjYMDfD3g+31RQzyRczyO4idr59mfP3/Uxe/fv8WvX7/Ez58/xY8fP8T379/Ft2/fxNevX8XpdBJfvnwRnz9/Fp8+fRIfP34UHz58EO/fvxfv3r0Tb9++Fa9fvxavXr0SL1++FC9evBDPnz8Xz549E0+fPhVPnjwRjx8/Fo8ePRIPHz4UDx48EPfv3xf37t0Td+/eFXfu3FENu9QDAoFAIBAIBAKBdIJP2AEAYOPIYg3WAb7fFzXEEznH4yh+snaiYQcAgI2DA3w94Pt9UUM8kXM8juInaycadgAA2Dg4wNcDvt8XNcQTOcfjKH6ydqJhBwCAjYMDfD3g+31RQzyRczyO4idrJxp2AADYODjA1wO+3xc1xBM5x+MofrJ2omEHAICNs+TBpH4ZzMWNmO/XnSyzxlTA9/uihibvKI3oWI7iJ2vniIb9VlwGv9Hq8pYaxxM07PWxZOEvraWeL/7b//6Km4vlfACOy3IHk/5V4vNupSXWmA74fl/U0OQdpREdy1H8ZO0c3LDfXp6L65Nz7/ZyVNOebdjfXOkfChJVRjVr8rkVooGKfmVupmK1Y0uNmNXr7KopgyG6MM75a3lTUL8eWMnEjSUadjTsYBlSBxOnroX1MVeT9Hx+PZu6dlJrKDZa59f0vaLgFwVnTENtvp+DwU3eBHHg+iir4wRrlMaoeIexMeuu0dOkGBzLpWDkTLS3iLHWzgm/EnMS1+fNYpe3xLOy0A27aYgyhmhj3QKkG2U32XRyXoguz8wYyol/b8RFM/bqqnmnUEy8pI/mWrthJ4ryxJCbeiZKa6nnmU0BQM1QBxOnrkX1MXvomnrrvD997YzXaO/JeRNzr1nn1/I9xy+8MZb6fD8H/Zu8aeLQx0e0jtOswc5dNzYqZvSaa9I/lkvRI2cYe8DaOel32G8vm8UnbNi7xDKGRUbT9/V7tsDoZAyLpD/GoueTY9XzgiPtHDetnuaBgl53CWjbpifa1DNSWks931gxAWAq4oOJUdfMIRuOSe4lM77bRjPUzmgNO9926/w6vrfz5fzCG9NSoe/noG+Tx/FxeUwfH9E6TrMGTw8/T/U7a8SqRN9YLkU5VunYU1g7J/+E/fz6RDwrS/477Gmj4wIYjCWKlMJ82uHedx2o/lxI0G68WdMbT28Mu24rzjtkANV4GXhzLUnZZOAkgm1w9dhYF42xyz5vRc9NHT7qnjs2UJK3bjBPM55ay6U8L51D4bx2nnZ98yx7P9DL06GRMAfaOaxQOmVtMWRyCeyL6GDi1DWixkl0XsX1Qd13c2iG2pl71jzdZJ1fxfceab90lMfU6Ps5GN7kjYhDDx9J8jqOWIOpRxdzsxaRs1tgeCyXIpUz+n7UHyawdk7WsJ+uz5ugBt9r7yFDG/YmtfRPf6qptX92kkslYtDwSsLEDa45xcQtPvHGixt2b7zC2GXXITaT3DgXFxd+YNVa6Q0UrxOjNmSzVufTUN/Y5/qdbt6wkCtdXT0T9uTXjdfR9vhrhZTnpXOIssGfR5O977yvdXXzTa7bXcvxU/iomEtgV0QHE6euEbmlIN+Nc4y1BnGtcpPMQ2IND3qPavS7Wh/7Zyf/J9e1YxXfe+T8YimNqdP3czC8yRsRB66PDHkdR6zB1MOeazfqLHLPmW0xPJZLkcoZsx+vrpTf1XmvhIhNg7VzkoZdN+tz/isxpY1ii4gWb1ifBHVe5BSTsGnSjZa9Dgukvo7j5uoX2imvm/nkGEeXUNcQrZf2RSe+D6g57CbVwwh9A1/64yniuHHXDQ+W0lrleekcCudV10SByt4P1kgfihTDfRQMkYOiOIN9MKhptLnl7RtzL3yXmo+1hhzm52uydhbzk96jHTrvbU3zhk2tq8Mqvvco+UVSGFOp7+dgeJM3Ig5MH1nyOo5Yo0+smmslK8SIy/BYLkUqZ8x+IutD3GdYO0c27OZ/NB3xybqVoQ27bU7to7ZZ7TKvueYkse8kTjHRaznvmTl1wxY0nuZZuwk86fTz1pXvqD/Luew6el7CFS2RXgRh8ZSoe63Nsc/1Ju7m9cdLbMIFEs6RWzcoHJZ4LZ/ivFa37Bh6Hkn2fkF3n+l8FM2hhMh1UD3DmkaJ32hJubqK652XX5aJaye5hsc26/wqvvdI+6UjP6ZW38/B8CZvRBzYOaPJ6zhiDaYebb6Y+3mb12N4LJcilTNBf2hR8Unnw4iG3fw77OfX4kQ+7yeDGnaTTFmjExvCTVyVnNKWhKQ+MdWFiyhCat4gICk9QtQ4rZecS7+v7VfvKr39NUMovUKUzYEyflE3Pnf8kD9ozHivAMdxK66b8JO/VkxxXkIXSTgvNY8ke7+ge8e8PgL7JTqYUjmQOpAd4kYp8SEAYw2Vl82YlHS1M7GGB71HrR5r1flVfO+R8ItHbky9vp+D4U3eiDj0zJm8jiPWYOrhnTdurDfG8FguRSpngv7QkoiPtXNww67+RZiJmnUpgxr2VCJ5RtOOKTW1cWGNoefQ651dXQXrJgIUoW2V4+SmsbaptZoL+98cJdskakMG83ibVPkwf/jE48
NYxHErrpvwkz8mpjzv8DGS7P2C7i0z+wjsl/hgonOgvPe7+mJJvzNsDfU82KucmmR1i/bZynV+Hd+7JPzikR5Ts+/nYHiTNyYO/XyU13HMGjw9/PPGXBf6gTUYHsulSMXK3A/zP/jByWLtHNiw60/Xx3xnPZRhX4nRyRcarZMrSD7XCabYhEnrwikmqc2mnS7X9NfQ44NgyLHBOnrti0ZHZ241rvkhoPFDtl40JPVyUD4JJvI3qfFtKI6u5HhnTu134l52XfteGD9/TAhn3tD/1LzUPJLs/cwazZ0md+31dD7i5hLYB9TBpHPHyYFiXaMOiVRt1fRfw+RmjzU6tlnn1/J9B2dcaszYNdb1/RwMb/LGxKGfj/I6jluDPYbK1UJfsTTDY7kUmZwxfWL3KO4PLNbOYQ376Vqcy4lJGfZ9dqph100JtYabNMbI5HONTtJOcoVEwikmWj86ge16oe8jm6g1zAbyn1k7yxsm6TdnPqVfoJy/SXWi+UNM8pkx0aa2eltpXg7XKa8rscVBi4xVKR5TzUvNI8neD/QK/e/l2mQ+itfJ+QfUTepgUnnh5IBf1/x8VxLmsMlHIrVb8mvERHu1sEaUx61so86v5XuOX4pjKvf9HPRt8iaJg4HrI0rHKdcojVHPw9jYsytXLBambyyXghuraFzCt9bOkf/T6XSS/4QdLI1OpLgop+4DAOZjroNpif1ce82A7/fFVps8lxp03AJH8ZO1Ew07INGFPvxkJv1XNgCA+ZjnYNL7ufRJ6DiWWGNe4Pt9UUOTd5RGdCxH8ZO1Ew07SGKbdldQ/AFYHhzg6wHf74sa4omc43EUP1k70bADAMDGwQG+HvD9vqghnsg5Hkfxk7UTDTsAAGwcHODrAd/vixriiZzjcRQ/WTvP/vz5oy5+//4tfv36JX7+/Cl+/Pghvn//Lr59+ya+fv0qTqeT+PLli/j8+bP49OmT+Pjxo/jw4YN4//69ePfunXj79q14/fq1ePXqlXj58qV48eKFeP78uXj27Jl4+vSpePLkiXj8+LF49OiRePjwoXjw4IG4f/++uHfvnrh79664c+eOatilHhAIBAKBQCAQCKQTfMIOAAAbRxZrsA7w/b6oIZ7IOR5H8ZO1Ew07AABsHBzg6wHf74sa4omc43EUP1k70bADAMDGwQG+HvD9vqghnsg5Hkfxk7UTDTsAAGwcHODrAd/vixriiZzjcRQ/WTvRsAMAwMbBAb4e8P2+qCGeyDkeR/GTtRMNOwAAbJwlD6Y3V2fi7OJGzPo7OBdYYyrg+31RQ5N3lEZ0LEfxk7VzeMN+uhbn3m/BPBfXJ2IcU9Cwg76UDh71/OqNuVqKv+LmIq8XAH1Z7mDSv85+3m2zxBrTAd/vixqavKM0omM5ip+snYMb9tP1pdegn67PRzXt2Yb9zZX+oSBRZVRjJp9bIZql6Nfsc+fiVLa/N+LCfUfJVVMaNwyh88VNXS0mGnZwFFIHU7aukXWpE2pr6PmI2pWqwUusYVirzq/ne91c596L7EiMs9Tm+zkY3OQVfKQojOH6KKvjVHsxo6uKURhjM35LfcLgWM5Oee+6tHmR6BusnRN+JeZWXDYLXt5Sz8pCN+ym+bFGpxKLKnKO4doZF6LLMzPGm29Yo2WLT6iaWrPnXKNQm8m1MQ2ts7Z/8c3YQ+8Qsqg4qOe5XQJAJVAHE6+uxej3iKbN1kCqLsp5GXNbpl5jzTq/ju/1tTsdVbdLNdCnPt/PQf8mr+wjzpg+OUPryNEjJs45ZszdONkfBphrLkX/WC6B9q/rKmrvtijfXoirqyZOib1h7ZyuYT/pr8hM2bB3CW4SLLKWvu8nqN4UYSMaJnGc1GU6/cyNNeE2vuan5M3sOzTsABSJDyZeXYtJNwnNy+pQdh91NS5VgymmXmPdOr+W7yPMGHfdUg30qND3c9C3ySv7iDOmX85QOnL0iIlzjjOPn1dmDm6eLUjfWK4GsXc1OgbyvorLMg37SVyfNwG9vCWe8ST/HXZuYkmCsalC6DWundP4pHWisM2j+m+Y/EaXVoigte9ZcdaNnjVC22J05m484zt3XsrcnG6KYB77uKR3aV71PGOLet68owuUmSOXK4ZwXjtPq495lr0f6OXp0EgYn3YOK5ROWVsMjFwC9REdTKy6FtMd1uaGQ+7ASO0VijnWiPdUMHa2Or8F3xvMulGNZO7xGn0/B8ObvLSPOhJjeuZMXkeOHppczuXm6WJuxmzgBy2K4bFcGGLvSnR8tG9z+9PaOaJhN016o4SUoZ+sWxnasDeppX/6U0lp/+wkl9oQRMJ6G8j89Ht1pe5Zm9KJ3pAIQAq1AeScgQ1uwDTGVidw8l1vHWrzp+z0iH/aTkIVEnMvPDDyuhnftmOkfeX4cGzuigpN7HNKl9gf4bzxPJrsfef9uGjKdbtrOZ5la9YWu04+l0CdRAcTq66F0PmuifPJJ/euy1xr6He1zfbPC9T5hvV9bzD1152irQuO0PPU6fs5GN7kcfZAYkzPnMnryNFDUhqXfm7PsBuVX+6Zsi2Gx3JhiL0bxn7mht2X20sZ2Jn+p9Ni4tkiosUb1qOY+M4ya6aSldhoumHq9HDn04U1nEuvG5mV0rmF8EfxHYkpmsUKWdjI2Q0cvltYk6W3JNbJFpXUq+p5YIP/Dm1nOG/K5uz9YI2yz10Stmb1HJpLoAYmaRrVO7lnPWsOxaxrmDptxBvG8od536sZZs1MTVvf9xJK9xh7BkX1plLfz8HwJo+zBxJjeuZMXkeOHg25nFOk59Fnm5FCzq3J8FguCb13wzN90YZdimraB34tZmjDbguUfdQ2zfZGj2JCF7lEwpv3U42Y30zFwVGYOdqN4Ymrs7E/HOPOl7LTwyQOaZCLHkcOi9Yp69b9IEPol9S7PG/o4xDK5/47dF6xYteQvW/fTxRkH6atOT3ZuQRqZHzTaHIssV+8XCKh94rPfGusVucb1ve9GcPcx9R8tfp+DoY3eT32QDimZ87kdeyhx8CYt/li9MuvtR7DY7kc5N5V+eD/oKr2VSJe1s5JG3b1TzueX4sT8awkgxp2k0zZIpAqot4GShST1LuKfPMbFkh1ndA/MYWB2niEP1IFwYOziRtyenmJxtTNoBNXipOopN68eUMfh1A+99+hdQ3npeaRZO/b94sx7mFrTk9WLoFaiQ6mVLxTdUDdT+VH5gf0FnqveMy1hrF1nTq/vu/VPi/W9g493m0G6vX9HAxv8hh7oODHvI868jqO3YuW9Dze2cKaax2Gx3IZUntX309LuE+snXV/wp5KJG9z0IVCf0qQazwbEpvJouegn3sJb68jRRNFzIXc6IQ/Crpawk9LaLRexY3M1c0lfIfSmzlv6OMQyuecuHDGSLL32/cLMe5ja1ZPRi6BaokPJk5d61C5QtyXpN7xKezrhtnWWLnOr+l7/S5he5LYxpp9PwfDm7yEjzxSY/rlTF7Hsh65nOtIz6Ped2Kl51s2ThyGx3J++u5dlQvh/jBYOwc27LfiMmjMZ/3FSYVNEBoZJmuUbKbYeJsnKkxm7qy3jV5RU
Mz9MOGJufSGDTaC1KV9N9bDJoI3H9n4UaR01vNan2i9gjEMH0W6yXfcScICTerNs1ndSyS4RD33J47eCf3frsOIXfZ+Zg0dA3vdw9aetii8XAK1Qh1MOk/ivRQ2Bcn7ilRtDSmMm3UNs0eCPNb2z1/n1/I9WYNd5NyBT+J36vb9HAxv8ji+TI9h50xDXseCHtmccyno6sXcjC3+ELAsw2M5L8W9S6DeSZzV1s6BDbv/L8RouRS35FieUA27NToWN2nMpk8+1+jN0gmVzNF6XG+bQuQJVeAS80XrhkEzG7B93sxDzefaWFKd8m3kk8gup9hYGLqFvg91I/XmzptIcEk4XhK/YwuRFumDcONQ80iy9wO9Qn97vubaWrQlXifnH1APqYNJ5YATb6qu6TGJg9bkHpHGCqpOaPHnm3+N9er8Or6n7LVi67Bfu7QEa1Xu+zno2+RxfMTzI89HEkrHfmvE8bFw5lFzhGeHPadWiFmKvrFcBs7ejVFxSZzX1s5JvxIzRvKfsAMAwHGZ62DSh3f6cJ+CJdaYE/h+X2yzyfOpQcctcBQ/WTvRsAMAwMaZ52DSnwSlPuWbhiXWmBf4fl/U0OQdpREdy1H8ZO1Eww4AABsHB/h6wPf7ooZ4Iud4HMVP1k407AAAsHFwgK8HfL8vaognco7HUfxk7UTDDgAAGwcH+HrA9/uihngi53gcxU/WzrM/f/6oi9+/f4tfv36Jnz9/ih8/fojv37+Lb9++ia9fv4rT6SS+fPkiPn/+LD59+iQ+fvwoPnz4IN6/fy/evXsn3r59K16/fi1evXolXr58KV68eCGeP38unj17Jp4+fSqePHkiHj9+LB49eiQePnwoHjx4IO7fvy/u3bsn7t69K+7cuaMadqkHBAKBQCAQCAQC6QSfsAMAwMaRxRqsA3y/L2qIJ3KOx1H8ZO1Eww4AABsHB/h6wPf7ooZ4Iud4HMVP1k407AAAsHFwgK8HfL8vaognco7HUfxk7UTDDgAAGwcH+HrA9/uihngi53gcxU/WTjTsAACwcXCArwd8vy9qiCdyjsdR/GTtRMMOAAAbZ8mD6c3VmTi7uBGz/g7OBdaYCvh+X9TQ5B2lER3LUfxk7ZysYb+9bIrA2aW4JZ5xBA17jr/i5oJbZPuMnR91OFy9MVdLsS0fADCW5Q4m/evs592yS6wxHfD9vqihyTtKIzqWo/jJ2jlNw366FudNEZitYX9z1cydbvxUU6jWN5Jq1LLz6ELmzlMsamq+C3Eze1eIhr0faNjBvkgdTH9vLrya5e21vzfiwn0WCLUt9XxXTTUMyNTOqP4aSW37IWtI1qrz6/mep2tWj4DafD8Hg5u8go8UmTF99klOx2geZ4IuF3J9SReHi/mbl1kZHMspKOQDZ19y9661c4KG/SSuz8/E5eVls+DUDbtpvDLG6OQlipxXUErz6OfubbspEv7TqIAt0bDXi/Jj1okAgBLUwaSLvVt/TO0r7Ldk02brpPc+swazfzgescZKdX4d3/N07adHuIak5DO77kpn7Az0b/LKPmL7kblPaB3NGpk5dD5oHVLNOGdMLfSP5RSUY83Zl332rrVzdMN+uj4XZ+fX4nQ7fcPeGWQcFBlC39fvdQWmPA+B+YQkm9Bo2IuoIsXxNwAgSXww6eIe1qew9sVkGjpT89xHnNrZpxEZtsa6dX4t30dEuvbUo0Lfz0HfJo9jG2fM2IY99DmFHXOTHKv1u7i5IXOnNvrGcgrKsebsy35719o5rmE/ya/CnIvrU/PnGRr2jj6bID02/yyAU0w4DbuZp/1prBF/eVqn0K7IzmBe93WqMKh7znhyveaeThgzhigupXlCyvP2sL8Z065vnmXvB/p7OjQSxradwwqlU9YWg/2rstwYAHoQHUxE86UwuRfdN+jcpWuWepbM1T41OM10a+Rqee5ZAKPOr+97Q6hrTz1q9P0cDG/yOLb18WOaWEc9b8lXOseahi+bGzIH6YaxNobHcgoSsebsy55719o5omHXX4U5vz7p65Ua9sZClXhuEtI/WUo4G85QKL6KNvnNdQg1h7nXbRRap3Bz+9fhZpNzdDZT73obk0gW9U5zr9Mj3tCceULK8/aw35tHk73vvB8flnLd7lqOn8JHbcE01619ji4A9CU6mFK1J7sn6b2miXPZJ/1uuy8coecZvoZ9d406v77vDaGuvfSo0/dzMLzJ49iWHsPfJ5SOJn5XVyq23Rx+/Lvzh9ZD6aDuMXNu4wyP5RQkYs3Zlz1riLVzcMOu/lUY+VUYe2+1hl1ii4iW5DDWhpOY+UpNVsrpitLGtQUvs7HChrW9zm+28N2YeM1uI3cMmSekPG8P+4lDIns/WKNfcRriIx2XyB3ZPAGgzCRNo3on9yyXo+W9btFNA7HfRq+xTp1f3/cSQtc+elTq+zkY3uRxbOPaL8OU2CcNsY6Uv8xazvnnfWAUxtzLCz0fGvYxJGLN2Zd99m6DtXNYw0415ys17Dbp7SN7TW8Y3mbSTWCuuBmyRVBvCHIp7z1aJ78RjK9bO4n1w7HtGmq8I86a6p2CDpx5Qsrz9rCfWCd7376fO0hbJvCRWSeaQwkjnwBIML5pNPnt7ecOL49J6H2agppvzBpr1vn1fZ/QtYcetfp+DoY3eRzbePZbUnGJdUw02CoHwtjYBt5/Rz1r10LDPp5ErDn7smcNsXYOaNj1V2HUhk1I+zWZHjKoYTfGlZK4o7yZehWSlNMlyeLdoN6zm4rWKdzIqY2t9ZXS/ZTtjzXze+/Ga6p3sjrw5glhz8uxn1gne9++n4uFYiIfFdcBYBjRwZTKtVRNStZESebDhZbyXndR+8KpSaPWWLnOr+37pK5sPer1/RwMb/I4e4AzpiPeJ5pYx0SDHeSA37C784fvJ+arjOGxnIL8nolSwN2XPWuItXPc/3TqyhqfsKeKRsoZhc2kk5t6L0HCuRq9Iai11DqFZpQc4zWUDoG93ljSF7Efijow5wkpzmuvB4yRZO+37xeK01Q+2kkRBNuDe4CHB7ZF5SlxX5J6x6e81zvMWGf/jlpj5Tq/pu/zuvL0qNn3czC8ycvbpuGMsZixzj6xxDomxgY9SBRrG6crOc7NgX2cVcNjOQWpWHP2Zb8aYu2su2E3RodJnC6Q6c2kHdWzkGQb9sScRAHU47p5bFFz7VL37LWcw50g0MMba33kjG/nD+95ig6bJ6Q8bw/7iXWy9zNrNHeaXLDXU/mIWqdBxscZA0BfqINJ56iTa+Zwjg7h1H1Fuib6JMbJuYPcjuveyDVWrvNr+Z6ja1mPun0/B8ObPI4vE2NY+6SD1DHqHeJzS8/pxsXoE4yz79J5WQ/DYzkF6Xzg1Ad2DWmwdm66YbcJHYubkCZpk88581BzWHEcGmI2UCiew6Mx1HzOpjLvK52dDa6CG14787o5E461idCObwarMc5L4bVkyDwhrHmbP7HsJ9bJ3vfWiPPAi9NUPmqI8i14DkBfUgeTyj8n16hir8dQzVWDyXtiCynKtdPfu/4zw+g1JOvV+XV8z9c1q0flvp+Dvk0ex0flMYx9
4pDSMVonCKx+TsfGH6rjQeVsTfSN5RRw8kHCrw/5MRJr53QN+0jJf8IOAADHZa6DiTrgp2aJNeYEvt8XazR5falBxy1wFD9ZO9GwAwDAxpnnYFriU7b6P8mD7/dFDU3eURrRsRzFT9ZONOwAALBxcICvB3y/L2qIJ3KOx1H8ZO1Eww4AABsHB/h6wPf7ooZ4Iud4HMVP1k407AAAsHFwgK8HfL8vaognco7HUfxk7Tz78+ePuvj9+7f49euX+Pnzp/jx44f4/v27+Pbtm/j69as4nU7iy5cv4vPnz+LTp0/i48eP4sOHD+L9+/fi3bt34u3bt+L169fi1atX4uXLl+LFixfi+fPn4tmzZ+Lp06fiyZMn4vHjx+LRo0fi4cOH4sGDB+L+/fvi3r174u7du+LOnTuqYZd6QCAQCAQCgUAgkE7wCTsAAGwcWazBOsD3+6KGeCLneBzFT9ZONOwAALBxcICvB3y/L2qIJ3KOx1H8ZO1Eww4AABsHB/h6wPf7ooZ4Iud4HMVP1k407AAAsHFwgK8HfL8vaognco7HUfxk7UTDDgAAGwcH+HrA9/uihngi53gcxU/WTjTsAACwcZY8mN5cnYmzixsx6+/gXGCNqYDv90UNTd5RGtGxHMVP1s4RDfutuDxrNn4g59cnYmxZ0LAz+XsjLqy/exbdsFBvqXAf6RBZ0tbSWur51RtztRR/xc3FceI9BcsdTPrX2c+bEkusMR3w/b6oock7SiM6lqP4ydo5smE/F9cn6ll/yTbsb650g5qoMqrpkM+tRI2ALlLumGTBKqzlYceeXTUrhOg1L26mbElMozOw2obN25KNY4miLsrXF2JSd05FT92W9HtpLfV88dMbDXtfUgfT35sLU4OMELFUMXbHZPyu5/PrGWcNBbN2UmsoCu+X7Zinzlfh+4Z2bM81FBv1/RwMbvKyevPs58aT1nG6NUpjVLzDGBv7p+1pxjE4lrPDiRV/z1g7hzfsp2txPnvDbg52axBhjS4kbgEyTmiTTc/hvmqLjz9dea0QL+mj8VqPaZN73JzhJiQ35UoUdVHFAg17X0prqeeMXAfrQh1Muv64eWdqnxNPFd9sfXSJPxDgrNGvdsZrcN4v26HncF/V74yv89v2vUH9zeuFuLpq3knu9/p8Pwf9mzyej2L7/TOhTzxjHXk+5qzBzl03j+zf7BO6rkn/WC4BJ1bcPaOxdm66Ye8Sy2yYyBL6vn7PLTABJvncxre8Voxd5ybaABK9CdCw8yjqgoZ9EKW11HNGroN1iQ8muhb4ta9nfTR1sRvOWcNeM2tntAbn/XXr/JZ9r9FrybHqeWq/V+j7Oejb5A3SO7K/TzyZOg5ag6eHf27od5Y6s/rQN5arQeyHiMwYa+fwhv32sgnwpbilng2Q/HfY0xslbkgYmyrrPP6m7JLcvOPpQW8Mu3b703oj4VK2iVL/lWPkvOavo1zx5ubO6+gYXmt78knV6mQlWMTqbudS4vlF481jbSXGtXCa4h6+zetn4unMoyVxQFG6Bbq4elC2ev6QEijO0zuYpxlPreVSnpfeD+G8dp52ffMsez/Qy9OhkTAP2zmsUDplbTGEe4kaszGig8nkV+CC1jZ7P/Zzur4pv7ljmWt0lGtntIZH+v0+drQY/cfW+a37Xue7rk05/9bo+zkY3uT10DuMX494Slg6hj7mrMHUo4u5sTl19q3M8FguTBgriswYa+fIhl0GspPLW2IcU4Y27E1q6Z/+VMNk/1xIrsQm0fA3pVso4zmJhp1a19xzx6nN0tyLdUj8ENBnXqfwetfUHAFyvLc2sflj3WOd9ZguRtqPvm4RSr9Mw97HB1n94viH+kZEulFzdu+r+YI4rOXX8rz0fqBs8OfRZO8772tdXR/KdbtrOX4KH3l7VmHsy+XeBogOptR+iPyifVCuj7Gv+GtY6FzpINbwyL3PtcOBqgktJV07Nu374FrlN5nLxBoe2/T9HAxv8rh6m3FuHHruJZaOoY85azD1sPX5RtXUQqxXZHgsFya7HwyZMdbOEf/TaSCmgZ/nX4kpbRRbRLRknWLHJg9ofjEJD3/dMNjrsECm5/Xfi687wjklPed17G6vOclEEq+t5gwm8telbIh1i0gVGkVPHzD084Zk126IntM2Woq2EvZw9e7r1/K8tG/DedU1kbPZ+8EaKX/RDPdRMEQOysd3AwxvGiXa7mx9pObrtYYkvQ8VRT8X3ufY0WLGJnO/tFbHln0f5nyyYa/U93MwvMnL6W2etT4Kal6vnOHoSPiYs0afvFJ2BGtsjOGxXJLSfpDkx1g7p2vYGzldn4uz82txIp6VZGjDrpvmLtHsdaoY6EQcU7g6wobdJr1uPHQAuiZEX5PTBpuIajw04ZySnvM6CeFuSl6zFBYlI87ilO7euokCFeoWkSo0inG+9deO46+eE01nC6Fbm4eEzrGt6/m1OK/VLTuGnkeSvV/Q3Wc6H0VzKEnl1jYY2jRy66PnJwtzjQ46VyzkGh7p97l2WNRa2ZjmdXXZrO/VGL8uqTUIH9fq+zkY3uTx9bY+as/VnnuppCPpY3bOlPVo88XcXytWJYbHcjnK+6E8xto5fcM+8Hvtgxp2k0xRs6mSMt4EHMcl1yLQm5IomGqNoLlObExFUHiVnvTAuGHvO69TtO31m6Ag0xi/eEU/9hWlu7duQt9Qt4hUoZGM9K2/trGpma8TP8YRGd3U3MEc5Hor+bU4L6GLJJyXmkeSvV/QvWNeH9VAdDClbHFz0Ywp18fED7ycNTzoXNFkfqhuSbzPtkOjYp6qFS05XX226nttZ1q6tev1/RwMb/L66G3G2rrPyRmHnI5JH/fIy5IeXt1MxHoLDI/lMnD2A2eMtXPShv32sll4yU/YU4lEJKV2Cifp+JuSatiblVRxPLu6Uv8NiyY1r7c57DW5vp7DL549501cqz/nkobc6LGvKN39dSkbwjEEicKmGedbb4yyM795IrK6NQS+i9dbz6/leYePkWTvF3RvmdlHNRAfTLQtXk1i1ke6jkkYa3ika2f6HZfE+yvX+Tp8r1HPg/1es+/nYHiT10dvM7ZQe1KxSemY9zFnDZ4eft206/Y8FxdgeCznh7MfuHvG2jm4Yb+99D9J15+uD/8fT4d9JUYnX1igtBO65NPJyCkkEv6mTBZCU+SkuBuD1IMoiEp/cv3cZmPOG27C9trYTdmjML52JrTJFt0LdCfXDYuDnCeIo4eyJ10wxvjW18/YGUof3eS1v6j3nFzPGd/6I7yX1du+18+vnHm1bwP9g3mpeSTZ+5k1dD7a6+l8FK/TIOOT8dEWoA4m7QPHFtNEdfXB+C2wTb9n8yRf78pruKTm4tbU1DiOHVI1ogYk4epUi+81ygfemnX7fg6GN3k5H3W+UJizx41Vn3hSOnJ8zFmDPYbKo9DOlRkey3nhxKrPnrF2jmjYZfBcGfdPPFINuzUoFjdpTEHp9dxKl7S8tXz0O/RzvSmIYJiN3ImzcQzqXTKK2hayYHPndTZhvCmNr1IZZDZ2u0YzLtSV0j1ex25+LdIe5cvgUPC
I7OvebRnoW18/rZs/xOib0k+t66+l5nR0ceeL/LGiX6eal5pHkr0f6BXuQS+2k/koXifnn62QOpiUfY4tns8Uhfpo/EqEqKW0RuTPVsw6hTWK7yvWq/Nb9n2Iss/N58p9Pwd9mzyO3tQYyufceMY68nws4axRGqOeh3XR1uBcwi5M31guAydW/HhKrJ2TfiVmjOQ/YQdgGXThjQ+Q1H0AlmCug2mJvK5978D3+2KbTZ5PDTpugaP4ydqJhh0AB33AhZ+OmJ+GN/TJAjgW8xxMOq9Ln9qOY4k15gW+3xc1NHlHaUTHchQ/WTvRsAMQYJt2V3DogTXBAb4e8P2+qCGeyDkeR/GTtRMNOwAAbBwc4OsB3++LGuKJnONxFD9ZO9GwAwDAxsEBvh7w/b6oIZ7IOR5H8ZO18+zPnz/q4vfv3+LXr1/i58+f4sePH+L79+/i27dv4uvXr+J0OokvX76Iz58/i0+fPomPHz+KDx8+iPfv34t3796Jt2/fitevX4tXr16Jly9fihcvXojnz5+LZ8+eiadPn4onT56Ix48fi0ePHomHDx+KBw8eiPv374t79+6Ju3fvijt37qiGXeoBgUAgEAgEAoFAOsEn7AAAsHFksQbrAN/vixriiZzjcRQ/WTvRsAMAwMbBAb4e8P2+qCGeyDkeR/GTtlOI/wef2R2IgR60PgAAAABJRU5ErkJggg=="
+    }
+   },
+   "cell_type": "markdown",
+   "id": "86bd2236-e241-4d99-97d0-977b1784a2fe",
+   "metadata": {},
+   "source": [
+    "## Segment 2: Learning the API\n",
+    "\n",
+    "### Task 2.1: Examine the `hurricanes` CSV file\n",
+    "\n",
+    "The `project.py` file will allow you to access the dataset you'll use this week, `hurricanes.csv`. We generated this data file by writing a Python program to extract data from several lists of hurricanes over the Atlantic Ocean on Wikipedia (here is an [example](https://en.wikipedia.org/wiki/2022_Atlantic_hurricane_season)). You can take a look at the script `gen_csv.ipynb` yourself. At the end of the semester, you will be able to write it yourself.\n",
+    "\n",
+    "Open `hurricanes.csv` with Microsoft Excel or some other Spreadsheet viewer and look at the hurricanes in the dataset. The data shows:\n",
+    "\n",
+    "* `name` (the name of the hurricane),\n",
+    "* `formed` (the date of formation of the hurricane),\n",
+    "* `dissipated` (the date of dissipation of the hurricane),\n",
+    "* `mph` (the max wind speed in mph of the hurricane),\n",
+    "* `damage` (the damage in US dollars caused by the hurricane),\n",
+    "* `deaths` (the number of deaths caused by the hurricane).\n",
+    "\n",
+    "Often, we'll organize data by assigning numbers (called **indexes**) to different parts of the data (e.g., rows or columns in a table). In Computer Science, indexing typically starts with the number `0`; i.e., when you have a sequence of things, you'll start counting them from `0` instead of `1`. Thus, you should **ignore the numbers shown by your Spreadsheet Viewer to the left of the rows**. From the perspective of `project.py`, the indexes of `1804 New England hurricane`, `1806 Great Coastal hurricane`, and `1812 Louisiana hurricane` are `0`, `1`, and `2` respectively (and so on).\n",
+    "\n",
+    "For example, consider this example from `hurricanes.csv` as viewed from Microsoft Excel:\n",
+    "\n",
+    "![table.PNG](attachment:table.PNG)\n",
+    "\n",
+    "The **index** for the `1812 Louisiana hurricane` is `2` but it is the third entry in the dataset, and it is on **row** `4` of the table. Therefore, you must follow this convention for all the questions\n",
+    "asking for the value at a particular **index**."
+   ]
+  },
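+  {
+   "cell_type": "markdown",
+   "id": "0f3a7c21",
+   "metadata": {},
+   "source": [
+    "For example, here is a minimal sketch of this indexing convention using a plain Python list of the first three names (not the `project` functions, which you will meet in the next task):\n",
+    "\n",
+    "```python\n",
+    "# the first three hurricane names, copied from hurricanes.csv\n",
+    "first_three = [\"1804 New England hurricane\", \"1806 Great Coastal hurricane\", \"1812 Louisiana hurricane\"]\n",
+    "first_three[2]  # '1812 Louisiana hurricane' -- the third entry, at index 2, shown on row 4 of the spreadsheet\n",
+    "```"
+   ]
+  },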
+  {
+   "cell_type": "markdown",
+   "id": "497e2cdb",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "### Task 2.2: Explore the API\n",
+    "Use the inspection process we learned in [Lab-P3](https://git.doit.wisc.edu/cdis/cs/courses/cs220/cs220-f23-projects/-/tree/main/lab-p3) and [Lab-P4](https://git.doit.wisc.edu/cdis/cs/courses/cs220/cs220-f23-projects/-/tree/main/lab-p4) to know more details of the 'project' API. In Lab-P3, we saw how to use `dir`, and `help` to learn the API. Run the following cells to explore the API:"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "417e06d8",
+   "metadata": {
+    "execution": {
+     "iopub.execute_input": "2023-10-04T01:15:48.731739Z",
+     "iopub.status.busy": "2023-10-04T01:15:48.730742Z",
+     "iopub.status.idle": "2023-10-04T01:15:48.817002Z",
+     "shell.execute_reply": "2023-10-04T01:15:48.815992Z"
+    },
+    "tags": []
+   },
+   "outputs": [],
+   "source": [
+    "# it is considered a good coding practice to place all import statements at the top of the notebook\n",
+    "# please place all your import statements in this cell if you need to import any more modules for this project\n"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "98c70d68-d464-4389-b611-6b8e935c1bfa",
+   "metadata": {
+    "execution": {
+     "iopub.execute_input": "2023-10-04T01:15:48.822022Z",
+     "iopub.status.busy": "2023-10-04T01:15:48.821002Z",
+     "iopub.status.idle": "2023-10-04T01:15:48.831670Z",
+     "shell.execute_reply": "2023-10-04T01:15:48.830571Z"
+    },
+    "tags": []
+   },
+   "outputs": [],
+   "source": [
+    "# use the 'dir' function to learn more about the project API\n"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "a7db4a49",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "Spend some time reading about each of the seven functions that don't begin with two underscores. For example, run this to learn about `count`:"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "7639d9e2-ea47-47c1-8d42-40e6381ccfaa",
+   "metadata": {
+    "execution": {
+     "iopub.execute_input": "2023-10-04T01:15:48.835696Z",
+     "iopub.status.busy": "2023-10-04T01:15:48.834695Z",
+     "iopub.status.idle": "2023-10-04T01:15:48.843081Z",
+     "shell.execute_reply": "2023-10-04T01:15:48.842094Z"
+    }
+   },
+   "outputs": [],
+   "source": [
+    "help(project.count) "
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "fd1de2e2-0e4d-4a9a-8c68-ceac35d03d75",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "Alternatively, you could run the following to just see the function's documentation:"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "4c4d19c5-ae7b-4ba6-a5d1-63a2af3093dd",
+   "metadata": {
+    "execution": {
+     "iopub.execute_input": "2023-10-04T01:15:48.847096Z",
+     "iopub.status.busy": "2023-10-04T01:15:48.847096Z",
+     "iopub.status.idle": "2023-10-04T01:15:48.852573Z",
+     "shell.execute_reply": "2023-10-04T01:15:48.851560Z"
+    }
+   },
+   "outputs": [],
+   "source": [
+    "print(project.count.__doc__)"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "8b68dc0e-bfd1-4ad0-aa57-66f7067be401",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "You may also open up the `project.py` file directly to learn about the functions provided. E.g., you might see this:\n",
+    "\n",
+    "```python\n",
+    "def count():\n",
+    "    \"\"\"This function will return the number of records in the dataset\"\"\"\n",
+    "    return len(__hurricane__)\n",
+    "```\n",
+    "\n",
+    "You don't need to understand the code in the functions, but the strings in triple quotes (called *docstrings*) explain what each function does. As it turns out, all `project.count.__doc__` is providing you is the docstring of the `count` function.\n",
+    "\n",
+    "Try to learn other functions in `project.py`, by using `help` function. For example, you may try: "
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "eec4b9f8-cb6b-49ce-b32a-714eb137e3da",
+   "metadata": {
+    "execution": {
+     "iopub.execute_input": "2023-10-04T01:15:48.856590Z",
+     "iopub.status.busy": "2023-10-04T01:15:48.855569Z",
+     "iopub.status.idle": "2023-10-04T01:15:48.863044Z",
+     "shell.execute_reply": "2023-10-04T01:15:48.861033Z"
+    }
+   },
+   "outputs": [],
+   "source": [
+    "help(project.get_name)"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "1d1241a2-bad8-45de-bc52-884d74d2fb1b",
+   "metadata": {
+    "execution": {
+     "iopub.execute_input": "2023-10-04T01:15:48.867053Z",
+     "iopub.status.busy": "2023-10-04T01:15:48.867053Z",
+     "iopub.status.idle": "2023-10-04T01:15:48.874849Z",
+     "shell.execute_reply": "2023-10-04T01:15:48.873838Z"
+    },
+    "tags": []
+   },
+   "outputs": [],
+   "source": [
+    "# now try getting help for the other functions in the `project` module\n"
+   ]
+  },
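+  {
+   "cell_type": "markdown",
+   "id": "d0c5a1e2",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "Docstrings are not special to `project.py`; any function you write can have one. Here is a minimal sketch (the `greet` function below is a made-up example, not part of this lab) showing how `help` and `__doc__` pick up a docstring you write yourself:\n",
+    "\n",
+    "```python\n",
+    "def greet(name):\n",
+    "    \"\"\"greet(name) returns a greeting for the given name\"\"\"  # the docstring\n",
+    "    return \"Hello, \" + name + \"!\"\n",
+    "\n",
+    "help(greet)           # prints the signature and the docstring\n",
+    "print(greet.__doc__)  # prints just the docstring\n",
+    "```\n",
+    "\n",
+    "Writing docstrings like this for your own helper functions (as you will do later in this lab) makes them much easier to understand when you come back to them."
+   ]
+  },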
+  {
+   "cell_type": "markdown",
+   "id": "9897e54c-5ab0-4681-9982-f288feaea56b",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "### Task 2.2.1: Getting familiar with `project.py`\n",
+    "\n",
+    "You will now demonstrate your familiarity with the functions inside the `project` module by answering a few simple questions. You must have already imported the `project` module to this notebook. Make sure you placed the `import` statememnt at the **top** of the notebook in the designated cell.\n",
+    "\n",
+    "**Remember:** In Computer Science, we start indexing at `0`."
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "f8b0cdab",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "**Question 1:** What is the `name` of the hurricane at **index** `0`? "
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "55fb64c1-678c-459a-99b7-2f52b70a5daa",
+   "metadata": {
+    "execution": {
+     "iopub.execute_input": "2023-10-04T01:15:48.879871Z",
+     "iopub.status.busy": "2023-10-04T01:15:48.878851Z",
+     "iopub.status.idle": "2023-10-04T01:15:48.886895Z",
+     "shell.execute_reply": "2023-10-04T01:15:48.885881Z"
+    },
+    "tags": []
+   },
+   "outputs": [],
+   "source": [
+    "# we have done this for you!\n",
+    "name_idx0 = project.get_name(0)\n",
+    "\n",
+    "name_idx0"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "874b92fc",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "outputs": [],
+   "source": [
+    "grader.check(\"q1\")"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "4b8ae0c1",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "**Question 2:** What is the `name` of the hurricane at **index** `1`? "
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "a0e0e7c5",
+   "metadata": {
+    "execution": {
+     "iopub.execute_input": "2023-10-04T01:15:48.902419Z",
+     "iopub.status.busy": "2023-10-04T01:15:48.901424Z",
+     "iopub.status.idle": "2023-10-04T01:15:48.908696Z",
+     "shell.execute_reply": "2023-10-04T01:15:48.907681Z"
+    },
+    "scrolled": true,
+    "tags": []
+   },
+   "outputs": [],
+   "source": [
+    "# replace the ... below with your code\n",
+    "name_idx1 = ...\n",
+    "name_idx1"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "0ffdd7e5",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "outputs": [],
+   "source": [
+    "grader.check(\"q2\")"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "abfc8bef",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "**Question 3:** What is the speed in `mph` of the hurricane at **index** `7`? "
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "4fc72e0a",
+   "metadata": {
+    "execution": {
+     "iopub.execute_input": "2023-10-04T01:15:48.923740Z",
+     "iopub.status.busy": "2023-10-04T01:15:48.922742Z",
+     "iopub.status.idle": "2023-10-04T01:15:48.929928Z",
+     "shell.execute_reply": "2023-10-04T01:15:48.928917Z"
+    },
+    "tags": []
+   },
+   "outputs": [],
+   "source": [
+    "# replace the ... below with your code\n",
+    "mph_idx7 = ...\n",
+    "mph_idx7"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "786eb947",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "outputs": [],
+   "source": [
+    "grader.check(\"q3\")"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "aa09dc8d",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "**Question 4:** What is the `damage` in dollars caused of the hurricane at **index** `5`? "
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "9742a42d",
+   "metadata": {
+    "execution": {
+     "iopub.execute_input": "2023-10-04T01:15:48.942902Z",
+     "iopub.status.busy": "2023-10-04T01:15:48.942902Z",
+     "iopub.status.idle": "2023-10-04T01:15:48.949941Z",
+     "shell.execute_reply": "2023-10-04T01:15:48.948928Z"
+    },
+    "tags": []
+   },
+   "outputs": [],
+   "source": [
+    "# replace the ... below with your code\n",
+    "damage_idx5 = ...\n",
+    "damage_idx5"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "e7641918",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "outputs": [],
+   "source": [
+    "grader.check(\"q4\")"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "38a2eaab-2165-4343-aa12-c48c20acb223",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "Notice that the damage amount ends with a \"M\". In this dataset, \"K\" represents one thousand, \"M\" represents one million, and \"B\" represents one billion. In P5, you'll need to convert these strings to integers (e.g., `\"1.5K\"` will become `1500`, `\"2.55M\"` will become `2550000`)."
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "74a940c8",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "**Question 5:** What is the `name` of the **last** hurricane in the dataset?"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "134a4959-0ff4-4826-8e55-f2263e306f26",
+   "metadata": {
+    "execution": {
+     "iopub.execute_input": "2023-10-04T01:15:48.962800Z",
+     "iopub.status.busy": "2023-10-04T01:15:48.962800Z",
+     "iopub.status.idle": "2023-10-04T01:15:48.968977Z",
+     "shell.execute_reply": "2023-10-04T01:15:48.967967Z"
+    },
+    "tags": []
+   },
+   "outputs": [],
+   "source": [
+    "# we have done this for you!\n",
+    "name_idx_last = project.get_name(project.count() - 1)\n",
+    "name_idx_last"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "0f86e61a",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "outputs": [],
+   "source": [
+    "grader.check(\"q5\")"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "044d18e9",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "Now, let us try to get the `name` at index `project.count()` instead. What happens? Why? Feel free to reach out to your TA/PM, if you are not sure."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "a4d38fbf",
+   "metadata": {
+    "execution": {
+     "iopub.execute_input": "2023-10-04T01:15:48.981467Z",
+     "iopub.status.busy": "2023-10-04T01:15:48.980466Z",
+     "iopub.status.idle": "2023-10-04T01:15:48.986188Z",
+     "shell.execute_reply": "2023-10-04T01:15:48.985177Z"
+    },
+    "tags": []
+   },
+   "outputs": [],
+   "source": [
+    "# execute this cell without changing anything\n",
+    "project.get_name(project.count())"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "f2ac51fa-ff0a-434f-8c2e-e40e6476f0fa",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "## Segment 3: Working with strings"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "38756d68",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "### Task 3.1: Indexing / slicing Strings\n",
+    "\n",
+    "Stepping back from the Hurricane data, Tasks 3.1 and 3.2 introduce us to performing operations with strings. While this will be covered in more detail during Friday's lecture, we will cover the essentials now.\n",
+    "\n",
+    "We can think of a string as a sequence of characters. For example, the string `my_str = 'hello_world!'` can be written as...\n",
+    "\n",
+    "| index  | 0    | 1    | 2    | 3    | 4    | 5    | 6    | 7    | 8    | 9    | 10   | 11   |\n",
+    "| ------ | ---- | ---- | ---- | ---- | ---- | ---- | ---- | ---- | ---- | ---- | ---- | ---- |\n",
+    "| string | h    | e    | l    | l    | o    | _    | w    | o    | r    | l    | d    | !    |\n",
+    "\n",
+    "... where we can then access specific characters of the string by an index, e.g. `my_str[0]` which returns `'h'` or `my_str[8]` which returns `'r'`.\n",
+    "\n",
+    "Furthermore, we can \"slice\" strings -- that is, get a particular section of characters. For example,\n",
+    "\n",
+    "- `my_str[1:5]` returns `'ello'`\n",
+    "- `my_str[:8]` returns `'hello_wo'`\n",
+    "- `my_str[5:]` returns `'_world!'`\n",
+    "- `my_str[:]` returns `'hello_world!'`\n",
+    "\n",
+    "Try running this in the cell below."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "cfaa8a24",
+   "metadata": {
+    "execution": {
+     "iopub.execute_input": "2023-10-04T01:15:48.990187Z",
+     "iopub.status.busy": "2023-10-04T01:15:48.990187Z",
+     "iopub.status.idle": "2023-10-04T01:15:48.996820Z",
+     "shell.execute_reply": "2023-10-04T01:15:48.995811Z"
+    }
+   },
+   "outputs": [],
+   "source": [
+    "my_str = 'hello_world!'\n",
+    "print(\"my_str[0] returns\", my_str[0])\n",
+    "print(\"my_str[8] returns\", my_str[8])\n",
+    "print(\"my_str[1:5] returns\", my_str[1:5])\n",
+    "print(\"my_str[:8] returns\", my_str[:8])\n",
+    "print(\"my_str[5:] returns\", my_str[5:])\n",
+    "print(\"my_str[:] returns\", my_str[:])"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "7319eb8c",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "Notice that slicing is **inclusive** on the lower bound and **exclusive** on the upper bound. We can also leave out a bound to start from the beginning (e.g. `my_str[:6]`) or to keep going until the end (e.g. `my_str[8:]`). Lastly, a negative index will count **backwards** from the **end** of the string.\n",
+    "\n",
+    "Try running the cell below."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "cc68f988",
+   "metadata": {
+    "execution": {
+     "iopub.execute_input": "2023-10-04T01:15:48.999820Z",
+     "iopub.status.busy": "2023-10-04T01:15:48.999820Z",
+     "iopub.status.idle": "2023-10-04T01:15:49.004572Z",
+     "shell.execute_reply": "2023-10-04T01:15:49.004572Z"
+    }
+   },
+   "outputs": [],
+   "source": [
+    "print(\"my_str[-1] returns\", my_str[-1])\n",
+    "print(\"my_str[-4:-1] returns\", my_str[-4:-1])"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "b4841367",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "**Your Turn!** Try slicing the below phone number! Can you extract the area code (first 3 digits), exchange code (middle 3 digits), and line number (last 4 digits) of the given phone number?"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "4316c947",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "**Question 6:** What is the **last digit** of the phone number: `608-867-5309`?"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "a55bbfff",
+   "metadata": {
+    "execution": {
+     "iopub.execute_input": "2023-10-04T01:15:49.009765Z",
+     "iopub.status.busy": "2023-10-04T01:15:49.008764Z",
+     "iopub.status.idle": "2023-10-04T01:15:49.016244Z",
+     "shell.execute_reply": "2023-10-04T01:15:49.015060Z"
+    },
+    "tags": []
+   },
+   "outputs": [],
+   "source": [
+    "# replace the ... with your code\n",
+    "phone_number = \"608-867-5309\"\n",
+    "last_digit = phone_number[...]\n",
+    "\n",
+    "last_digit"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "38c3bce1",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "outputs": [],
+   "source": [
+    "grader.check(\"q6\")"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "b5f6c9d3",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "**Question 7:** What is the **area code** (i.e., the first three characters) of the phone number: `608-867-5309`?"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "e4b0a772",
+   "metadata": {
+    "execution": {
+     "iopub.execute_input": "2023-10-04T01:15:49.028375Z",
+     "iopub.status.busy": "2023-10-04T01:15:49.028375Z",
+     "iopub.status.idle": "2023-10-04T01:15:49.035774Z",
+     "shell.execute_reply": "2023-10-04T01:15:49.034765Z"
+    },
+    "tags": []
+   },
+   "outputs": [],
+   "source": [
+    "# replace the ... with your code\n",
+    "phone_number = \"608-867-5309\"\n",
+    "area_code = phone_number[:...]\n",
+    "\n",
+    "area_code"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "4bfdf037",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "outputs": [],
+   "source": [
+    "grader.check(\"q7\")"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "0f8b2276",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "**Question 8:** What is the **line number** (i.e., the last four characters) of the phone number: `608-867-5309`?"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "81828521",
+   "metadata": {
+    "execution": {
+     "iopub.execute_input": "2023-10-04T01:15:49.049239Z",
+     "iopub.status.busy": "2023-10-04T01:15:49.048239Z",
+     "iopub.status.idle": "2023-10-04T01:15:49.055822Z",
+     "shell.execute_reply": "2023-10-04T01:15:49.054811Z"
+    },
+    "tags": []
+   },
+   "outputs": [],
+   "source": [
+    "# replace the ... with your code\n",
+    "phone_number = \"608-867-5309\"\n",
+    "line_number = phone_number[...:]\n",
+    "\n",
+    "line_number"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "77317008",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "outputs": [],
+   "source": [
+    "grader.check(\"q8\")"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "7efa87ad",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "**Question 9:** What is the **exchange code** (i.e., middle three characters) of the phone number: `608-867-5309`?"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "32d3afc4",
+   "metadata": {
+    "execution": {
+     "iopub.execute_input": "2023-10-04T01:15:49.069047Z",
+     "iopub.status.busy": "2023-10-04T01:15:49.069047Z",
+     "iopub.status.idle": "2023-10-04T01:15:49.074504Z",
+     "shell.execute_reply": "2023-10-04T01:15:49.074504Z"
+    },
+    "tags": []
+   },
+   "outputs": [],
+   "source": [
+    "# replace the ... with your code\n",
+    "phone_number = \"608-867-5309\"\n",
+    "exchange_code = phone_number[...:...]\n",
+    "\n",
+    "exchange_code"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "b685d93e",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "outputs": [],
+   "source": [
+    "grader.check(\"q9\")"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "73416a4d",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "**Question 10:** What is the **department code** (i.e., the letters at the start) of the course: `CS220`?"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "5581131d",
+   "metadata": {
+    "execution": {
+     "iopub.execute_input": "2023-10-04T01:15:49.088183Z",
+     "iopub.status.busy": "2023-10-04T01:15:49.087183Z",
+     "iopub.status.idle": "2023-10-04T01:15:49.093432Z",
+     "shell.execute_reply": "2023-10-04T01:15:49.093432Z"
+    },
+    "tags": []
+   },
+   "outputs": [],
+   "source": [
+    "course = 'CS220'\n",
+    "dept_code = course[...]\n",
+    "\n",
+    "dept_code"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "2f2906f6",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "outputs": [],
+   "source": [
+    "grader.check(\"q10\")"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "63f73dae",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "**Question 11:** What is the **course code** (i.e., the numbers at the end) of the course: `CS220`?"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "3219a1ba",
+   "metadata": {
+    "execution": {
+     "iopub.execute_input": "2023-10-04T01:15:49.106939Z",
+     "iopub.status.busy": "2023-10-04T01:15:49.105938Z",
+     "iopub.status.idle": "2023-10-04T01:15:49.114397Z",
+     "shell.execute_reply": "2023-10-04T01:15:49.113391Z"
+    },
+    "tags": []
+   },
+   "outputs": [],
+   "source": [
+    "course = 'CS220'\n",
+    "course_code = course[...]\n",
+    "\n",
+    "course_code"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "3c28dcef",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "outputs": [],
+   "source": [
+    "grader.check(\"q11\")"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "0529a58e",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "After that short detour, we will now go back to working on the hurricane dataset.\n",
+    "\n",
+    "### Task 3.2: Calculating Damage Costs\n",
+    "\n",
+    "Question 4 showed us that damage costs are represented as strings with suffixes for thousands, millions, and billions.\n",
+    "\n",
+    "We can **index** the last character of these damages to find the suffix. We can then potentially use it to determine whether the suffix represents a thousand, million, or a billion."
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "fbea9dee",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "**Question 12:** What is the **suffix** (i.e., the last character) of the cost `\"3.19B\"`?"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "272cc9d6",
+   "metadata": {
+    "execution": {
+     "iopub.execute_input": "2023-10-04T01:15:49.127415Z",
+     "iopub.status.busy": "2023-10-04T01:15:49.127415Z",
+     "iopub.status.idle": "2023-10-04T01:15:49.133887Z",
+     "shell.execute_reply": "2023-10-04T01:15:49.132880Z"
+    },
+    "tags": []
+   },
+   "outputs": [],
+   "source": [
+    "# replace the ... with your code\n",
+    "cost = \"3.19B\"\n",
+    "suffix = cost[...]\n",
+    "\n",
+    "suffix"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "f25dbe47",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "outputs": [],
+   "source": [
+    "grader.check(\"q12\")"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "1fc06e2a",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "**Question 13:** How many billions are there in the cost `\"3.19B\"`?\n",
+    "\n",
+    "Just as we found the suffix by **indexing**, we can also find the number by **slicing**. Answer the question by slicing the string to obtain the number of billions, and typecasting the string into a float."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "3ed13461",
+   "metadata": {
+    "execution": {
+     "iopub.execute_input": "2023-10-04T01:15:49.146701Z",
+     "iopub.status.busy": "2023-10-04T01:15:49.145701Z",
+     "iopub.status.idle": "2023-10-04T01:15:49.152171Z",
+     "shell.execute_reply": "2023-10-04T01:15:49.152171Z"
+    },
+    "tags": []
+   },
+   "outputs": [],
+   "source": [
+    "# replace the ... with your code\n",
+    "cost = \"3.19B\"\n",
+    "billions = float(cost[...])\n",
+    "\n",
+    "billions"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "a889adbe",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "outputs": [],
+   "source": [
+    "grader.check(\"q13\")"
+   ]
+  },
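+  {
+   "cell_type": "markdown",
+   "id": "c3b7f4a1",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "Putting Questions 12 and 13 together: in P5 you will eventually need to turn a whole damage string into an integer number of dollars. Below is a minimal sketch of one possible approach, handling only the \"B\" suffix; the helper name `format_damage` is made up for illustration, and you will need to decide for yourself how to handle the other suffixes:\n",
+    "\n",
+    "```python\n",
+    "def format_damage(damage):\n",
+    "    \"\"\"format_damage(damage) converts a damage string ending in 'B'\n",
+    "    into an int number of dollars (sketch: only 'B' is handled here)\"\"\"\n",
+    "    suffix = damage[-1]          # the last character, as in Question 12\n",
+    "    number = float(damage[:-1])  # everything before it, as in Question 13\n",
+    "    if suffix == \"B\":\n",
+    "        # round to avoid floating-point surprises before converting to an int\n",
+    "        return round(number * 1000000000)\n",
+    "    # TODO: handle \"K\" and \"M\" in a similar way\n",
+    "\n",
+    "format_damage(\"3.19B\")  # 3190000000\n",
+    "```"
+   ]
+  },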
+  {
+   "cell_type": "markdown",
+   "id": "89a8f7bf",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "### Task 3.3: Slicing dates\n",
+    "\n",
+    "Run the below cell which prints the formation and dissipation date of the first hurricane."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "ece79345",
+   "metadata": {
+    "execution": {
+     "iopub.execute_input": "2023-10-04T01:15:49.165266Z",
+     "iopub.status.busy": "2023-10-04T01:15:49.165266Z",
+     "iopub.status.idle": "2023-10-04T01:15:49.170093Z",
+     "shell.execute_reply": "2023-10-04T01:15:49.170093Z"
+    }
+   },
+   "outputs": [],
+   "source": [
+    "print(project.get_formed(0))\n",
+    "print(project.get_dissipated(0))"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "f4c5cc97",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "The dates are represented as a string in `mm/dd/yyyy` notation. Two digits are used to represent the month and day even when they can be represented with a single digit, that is, `'9/4/1804'` is represented as `'09/04/1804'`.\n",
+    "\n",
+    "To extract the month, we could run the following code..."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "425e901d",
+   "metadata": {
+    "execution": {
+     "iopub.execute_input": "2023-10-04T01:15:49.174098Z",
+     "iopub.status.busy": "2023-10-04T01:15:49.174098Z",
+     "iopub.status.idle": "2023-10-04T01:15:49.179687Z",
+     "shell.execute_reply": "2023-10-04T01:15:49.179687Z"
+    }
+   },
+   "outputs": [],
+   "source": [
+    "project.get_formed(0)[:2]"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "de4d2a52",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "Notice, however, that this is the *string* `'09'`.\n",
+    "\n",
+    "Write the code to get this as the *int* (e.g. `9`)."
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "c639454b",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "**Question 14:** In which `month` did the hurricane at **index** `0` form?\n",
+    "\n",
+    "Your answer **must** be an `int` between `1` and `12`. You **must not** hardcode the answer, but use the appropriate function from the `project` module to find the date of formation of the hurricane."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "510b89a1",
+   "metadata": {
+    "execution": {
+     "iopub.execute_input": "2023-10-04T01:15:49.184696Z",
+     "iopub.status.busy": "2023-10-04T01:15:49.183696Z",
+     "iopub.status.idle": "2023-10-04T01:15:49.190716Z",
+     "shell.execute_reply": "2023-10-04T01:15:49.189708Z"
+    },
+    "tags": []
+   },
+   "outputs": [],
+   "source": [
+    "# replace the ... with your code\n",
+    "month_idx0 = ...\n",
+    "\n",
+    "month_idx0"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "6f587508",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "outputs": [],
+   "source": [
+    "grader.check(\"q14\")"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "868d4b8c",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "### Task 3.4: Helper Functions for Month, Day, and Year\n",
+    "\n",
+    "The below functions will be useful in p5. Solve the below questions for getting the day, and year as an int. The function to get the month has already been done for you."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "1b18d8e1",
+   "metadata": {
+    "execution": {
+     "iopub.execute_input": "2023-10-04T01:15:49.203398Z",
+     "iopub.status.busy": "2023-10-04T01:15:49.202398Z",
+     "iopub.status.idle": "2023-10-04T01:15:49.207742Z",
+     "shell.execute_reply": "2023-10-04T01:15:49.206735Z"
+    }
+   },
+   "outputs": [],
+   "source": [
+    "def get_month(date):\n",
+    "    \"\"\"get_month(date) returns the month when the date is the in the 'mm/dd/yyyy' format\"\"\"\n",
+    "    return int(date[:2])"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "35b77034",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "You can confirm that `get_month` works by running the cell below."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "d08661ae",
+   "metadata": {
+    "execution": {
+     "iopub.execute_input": "2023-10-04T01:15:49.210743Z",
+     "iopub.status.busy": "2023-10-04T01:15:49.210743Z",
+     "iopub.status.idle": "2023-10-04T01:15:49.216183Z",
+     "shell.execute_reply": "2023-10-04T01:15:49.216183Z"
+    }
+   },
+   "outputs": [],
+   "source": [
+    "month = get_month(\"09/22/2023\")\n",
+    "\n",
+    "month"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "e3a6218d",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "### Task 3.4.1: Define `get_year(date)`\n",
+    "\n",
+    "You must now define this function, which will take in the `date` as a `str` and return the `year` as an `int`."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "9edfbae2",
+   "metadata": {
+    "execution": {
+     "iopub.execute_input": "2023-10-04T01:15:49.221192Z",
+     "iopub.status.busy": "2023-10-04T01:15:49.220192Z",
+     "iopub.status.idle": "2023-10-04T01:15:49.225922Z",
+     "shell.execute_reply": "2023-10-04T01:15:49.224914Z"
+    },
+    "tags": []
+   },
+   "outputs": [],
+   "source": [
+    "def get_year(date):\n",
+    "    \"\"\"get_year(date) returns the year when the date is the in the 'mm/dd/yyyy' format\"\"\"\n",
+    "    pass # replace with your code"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "a7aa9a4a",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "**Question 15:** What is the `year` in the date `\"09/22/2023\"`?\n",
+    "\n",
+    "You **must** answer this question by calling the `get_year` function."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "306f6044",
+   "metadata": {
+    "execution": {
+     "iopub.execute_input": "2023-10-04T01:15:49.229922Z",
+     "iopub.status.busy": "2023-10-04T01:15:49.228923Z",
+     "iopub.status.idle": "2023-10-04T01:15:49.234952Z",
+     "shell.execute_reply": "2023-10-04T01:15:49.234952Z"
+    },
+    "tags": []
+   },
+   "outputs": [],
+   "source": [
+    "# replace the ... with your code\n",
+    "year = ...\n",
+    "\n",
+    "year"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "5ed9e2ea",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "outputs": [],
+   "source": [
+    "grader.check(\"q15\")"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "0248f120",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "### Task 3.4.2: Define `get_day(date)`\n",
+    "\n",
+    "You must now define this function, which will take in the `date` as a `str` and return the `day` as an `int`."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "89b5f280",
+   "metadata": {
+    "execution": {
+     "iopub.execute_input": "2023-10-04T01:15:49.248971Z",
+     "iopub.status.busy": "2023-10-04T01:15:49.248971Z",
+     "iopub.status.idle": "2023-10-04T01:15:49.254102Z",
+     "shell.execute_reply": "2023-10-04T01:15:49.253094Z"
+    },
+    "tags": []
+   },
+   "outputs": [],
+   "source": [
+    "def get_day(date):\n",
+    "    \"\"\"get_day(date) returns the day when the date is the in the 'mm/dd/yyyy' format\"\"\"\n",
+    "    pass # replace with your code"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "d1a0314f",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "**Question 16:** What is the `day` in the date `\"09/22/2023\"`?\n",
+    "\n",
+    "You **must** answer this question by calling the `get_day` function."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "782abd17",
+   "metadata": {
+    "execution": {
+     "iopub.execute_input": "2023-10-04T01:15:49.258102Z",
+     "iopub.status.busy": "2023-10-04T01:15:49.257103Z",
+     "iopub.status.idle": "2023-10-04T01:15:49.264885Z",
+     "shell.execute_reply": "2023-10-04T01:15:49.263877Z"
+    },
+    "tags": []
+   },
+   "outputs": [],
+   "source": [
+    "# replace the ... with your code\n",
+    "day = ...\n",
+    "\n",
+    "day"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "0055da45",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "outputs": [],
+   "source": [
+    "grader.check(\"q16\")"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "9a6e4ab0",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "### Task 3.5: Using Helper Functions\n",
+    "\n",
+    "Using the helper functions you made above, complete the following questions.\n",
+    "\n",
+    "**Hint:** You'll use these helper functions in combination with functions from the project module."
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "821acf4f",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "**Question 17:** On what `day` did the hurricane at **index** `100` **form**?\n",
+    "\n",
+    "You **must** answer this question by calling the `get_day` function."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "671ea14e",
+   "metadata": {
+    "execution": {
+     "iopub.execute_input": "2023-10-04T01:15:49.277560Z",
+     "iopub.status.busy": "2023-10-04T01:15:49.276559Z",
+     "iopub.status.idle": "2023-10-04T01:15:49.282608Z",
+     "shell.execute_reply": "2023-10-04T01:15:49.282608Z"
+    },
+    "tags": []
+   },
+   "outputs": [],
+   "source": [
+    "# replace the ... with your code\n",
+    "day_formed_idx100 = ...\n",
+    "\n",
+    "day_formed_idx100"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "92a929fd",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "outputs": [],
+   "source": [
+    "grader.check(\"q17\")"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "03e570fd",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "**Question 18:** In which `year` did the hurricane at **index** `200` **form**?\n",
+    "\n",
+    "You **must** answer this question by calling the `get_year` function."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "acd08955",
+   "metadata": {
+    "execution": {
+     "iopub.execute_input": "2023-10-04T01:15:49.296118Z",
+     "iopub.status.busy": "2023-10-04T01:15:49.295117Z",
+     "iopub.status.idle": "2023-10-04T01:15:49.302715Z",
+     "shell.execute_reply": "2023-10-04T01:15:49.301707Z"
+    },
+    "tags": []
+   },
+   "outputs": [],
+   "source": [
+    "# replace with your code\n",
+    "year_formed_idx200 = ...\n",
+    "\n",
+    "year_formed_idx200"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "f7bff4ef",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "outputs": [],
+   "source": [
+    "grader.check(\"q18\")"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "832ac84e",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "**Question 19:** In which `month` did the hurricane at **index** `300` **dissipate**?\n",
+    "\n",
+    "You **must** answer this question by calling the `get_month` function."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "9562f391",
+   "metadata": {
+    "execution": {
+     "iopub.execute_input": "2023-10-04T01:15:49.315195Z",
+     "iopub.status.busy": "2023-10-04T01:15:49.315195Z",
+     "iopub.status.idle": "2023-10-04T01:15:49.321419Z",
+     "shell.execute_reply": "2023-10-04T01:15:49.320410Z"
+    },
+    "tags": []
+   },
+   "outputs": [],
+   "source": [
+    "# replace the ... with your code\n",
+    "month_diss_idx300 = ...\n",
+    "\n",
+    "month_diss_idx300"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "e619fa80",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "outputs": [],
+   "source": [
+    "grader.check(\"q19\")"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "6cfd8eb1",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "## Segment 4: Looping"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "fc0e5794",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "### Task 4.1: `while` and `for` loops\n",
+    "\n",
+    "Run the below code and observe the output."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "6f4ff12e-2142-4fc7-b7e0-dc54ec0a641a",
+   "metadata": {
+    "execution": {
+     "iopub.execute_input": "2023-10-04T01:15:49.334958Z",
+     "iopub.status.busy": "2023-10-04T01:15:49.333957Z",
+     "iopub.status.idle": "2023-10-04T01:15:49.338932Z",
+     "shell.execute_reply": "2023-10-04T01:15:49.338932Z"
+    }
+   },
+   "outputs": [],
+   "source": [
+    "i = 0\n",
+    "while i < 5:\n",
+    "    print(i)\n",
+    "    i += 1"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "1c6ff122-f8b3-4909-acd3-269a60756464",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "Equivalently, we can use `for` and `range(n)`. The `range(n)` function returns a sequence of numbers, from `0` to `n` but not including `n`."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "3cb57adb-af1a-4c16-affb-4744d626da41",
+   "metadata": {
+    "execution": {
+     "iopub.execute_input": "2023-10-04T01:15:49.342942Z",
+     "iopub.status.busy": "2023-10-04T01:15:49.342942Z",
+     "iopub.status.idle": "2023-10-04T01:15:49.347543Z",
+     "shell.execute_reply": "2023-10-04T01:15:49.347543Z"
+    }
+   },
+   "outputs": [],
+   "source": [
+    "for i in range(5):\n",
+    "    print(i)"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "a29c53a1",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "Now, we will try to use `while` and `for` loops to answer a few simple questions."
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "4c5d8826",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "**Question 20:** What is the sum of the numbers *0 to 25*, both inclusive?\n",
+    "\n",
+    "You **must** answer this with a `while` loop. Ask your TA/PM if you are not sure what to do."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "9192111e",
+   "metadata": {
+    "execution": {
+     "iopub.execute_input": "2023-10-04T01:15:49.352555Z",
+     "iopub.status.busy": "2023-10-04T01:15:49.351555Z",
+     "iopub.status.idle": "2023-10-04T01:15:49.359036Z",
+     "shell.execute_reply": "2023-10-04T01:15:49.358027Z"
+    },
+    "tags": []
+   },
+   "outputs": [],
+   "source": [
+    "i = 0\n",
+    "sum_while = ... # replace the ... with the correct initial value for the sum\n",
+    "while i ... 25: # replace the ... with the correct comparison operator\n",
+    "    sum_while += i \n",
+    "    i += 1\n",
+    "\n",
+    "sum_while"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "923aa8d9",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "outputs": [],
+   "source": [
+    "grader.check(\"q20\")"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "6bf1f360",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "**Question 21:** What is the sum of the numbers *0 to 25*, both inclusive?\n",
+    "\n",
+    "You **must** answer this with a `for` loop. Ask your TA/PM if you are not sure what to do."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "ba9ce950",
+   "metadata": {
+    "execution": {
+     "iopub.execute_input": "2023-10-04T01:15:49.371863Z",
+     "iopub.status.busy": "2023-10-04T01:15:49.371863Z",
+     "iopub.status.idle": "2023-10-04T01:15:49.379503Z",
+     "shell.execute_reply": "2023-10-04T01:15:49.378492Z"
+    },
+    "tags": []
+   },
+   "outputs": [],
+   "source": [
+    "# replace the ... with your code\n",
+    "sum_for = ...\n",
+    "for i in range(...):\n",
+    "    sum_for += ...\n",
+    "\n",
+    "sum_for"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "58f5b99e",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "outputs": [],
+   "source": [
+    "grader.check(\"q21\")"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "f5359d22",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "### Task 4.2: Looping through hurricanes\n",
+    "\n",
+    "You have had some practice with simple looping structures. You will now loop through the hurricanes dataset.\n",
+    "\n",
+    "Run the below code and observe the output."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "27698c77",
+   "metadata": {
+    "execution": {
+     "iopub.execute_input": "2023-10-04T01:15:49.392699Z",
+     "iopub.status.busy": "2023-10-04T01:15:49.391700Z",
+     "iopub.status.idle": "2023-10-04T01:15:49.396560Z",
+     "shell.execute_reply": "2023-10-04T01:15:49.396560Z"
+    }
+   },
+   "outputs": [],
+   "source": [
+    "for idx in range(10):\n",
+    "    print(project.get_name(idx))"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "03c9cfb3",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "Can you make the code above display the **year of the formation** of the first 10 hurricanes? How about the **first 15** hurricanes? Please feel free to reach out to your TA/PM and ask them for help, if you face any issues.\n",
+    "\n",
+    "You are now ready to answer some interesting questions with loops."
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "3ddb7960",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "**Question 22:** What is the **total** `deaths` caused by the **first** `10` hurricanes in the dataset?"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "3de665a3-f733-4acf-a1e5-18e836d4453f",
+   "metadata": {
+    "execution": {
+     "iopub.execute_input": "2023-10-04T01:15:49.401558Z",
+     "iopub.status.busy": "2023-10-04T01:15:49.400556Z",
+     "iopub.status.idle": "2023-10-04T01:15:49.408431Z",
+     "shell.execute_reply": "2023-10-04T01:15:49.407436Z"
+    },
+    "tags": []
+   },
+   "outputs": [],
+   "source": [
+    "# replace the ... with your code\n",
+    "total_deaths_first10 = ...\n",
+    "for idx in range(...):\n",
+    "    total_deaths_first10 += ...\n",
+    "\n",
+    "total_deaths_first10"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "8f0abf01",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "outputs": [],
+   "source": [
+    "grader.check(\"q22\")"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "593615d7",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "**Question 23:** What is the **average** speed (in `mph`) of **all** the hurricanes in the dataset?"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "26e575ac-4dc5-49de-a6d8-abe9a5cb2ea6",
+   "metadata": {
+    "execution": {
+     "iopub.execute_input": "2023-10-04T01:15:49.421940Z",
+     "iopub.status.busy": "2023-10-04T01:15:49.421940Z",
+     "iopub.status.idle": "2023-10-04T01:15:49.430503Z",
+     "shell.execute_reply": "2023-10-04T01:15:49.429493Z"
+    },
+    "tags": []
+   },
+   "outputs": [],
+   "source": [
+    "# replace the ... with your code\n",
+    "sum_wind_speed = ...\n",
+    "for idx in range(project.count()):\n",
+    "    sum_wind_speed += ...\n",
+    "average_wind_speed = sum_wind_speed/project.count()\n",
+    "\n",
+    "average_wind_speed"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "bce409d9",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "outputs": [],
+   "source": [
+    "grader.check(\"q23\")"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "774aefcd-f260-4554-a1fc-02076458ccb7",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "### Task 4.3: Filtering\n",
+    "\n",
+    "You will now *filter* the data using an `if` condition as you loop through the dataset."
+   ]
+  },
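+  {
+   "cell_type": "markdown",
+   "id": "b8e2d5f7",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "Before applying this to the dataset, here is a minimal sketch of the count-with-a-condition pattern on a plain list of made-up numbers:\n",
+    "\n",
+    "```python\n",
+    "speeds = [75, 120, 40, 150, 90]  # made-up example values\n",
+    "\n",
+    "num_fast = 0                     # the counter starts at 0\n",
+    "for idx in range(len(speeds)):\n",
+    "    if speeds[idx] > 100:        # the filtering condition\n",
+    "        num_fast += 1            # count only the items that pass the filter\n",
+    "\n",
+    "num_fast                         # 2 (only 120 and 150 pass)\n",
+    "```\n",
+    "\n",
+    "In the next two questions, you will use the same pattern, replacing the list indexing with calls to the functions in the `project` module."
+   ]
+  },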
+  {
+   "cell_type": "markdown",
+   "id": "d027146a",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "**Question 24:** How many hurricanes caused **more than** `1000` deaths in the dataset?"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "652b3f89-4ad1-44a2-bbbc-b21fea7009cf",
+   "metadata": {
+    "execution": {
+     "iopub.execute_input": "2023-10-04T01:15:49.444877Z",
+     "iopub.status.busy": "2023-10-04T01:15:49.444877Z",
+     "iopub.status.idle": "2023-10-04T01:15:49.453368Z",
+     "shell.execute_reply": "2023-10-04T01:15:49.452357Z"
+    },
+    "tags": []
+   },
+   "outputs": [],
+   "source": [
+    "# replace the ... with your code\n",
+    "num_hurr_1000_deaths = ...\n",
+    "for idx in range(...): # loop through ALL hurricanes in the dataset; do NOT hardcode the number here\n",
+    "    if ...: # replace ... with a Boolean expression\n",
+    "        num_hurr_1000_deaths += 1\n",
+    "\n",
+    "num_hurr_1000_deaths"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "ce2ac759",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "outputs": [],
+   "source": [
+    "grader.check(\"q24\")"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "6496187e",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "**Question 25:** How many hurricane `names` **start** with the letter *D* in the dataset?"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "5ffb3cb6-f420-4653-8ef4-bb5140c19114",
+   "metadata": {
+    "execution": {
+     "iopub.execute_input": "2023-10-04T01:15:49.467422Z",
+     "iopub.status.busy": "2023-10-04T01:15:49.467422Z",
+     "iopub.status.idle": "2023-10-04T01:15:49.477570Z",
+     "shell.execute_reply": "2023-10-04T01:15:49.476559Z"
+    },
+    "tags": []
+   },
+   "outputs": [],
+   "source": [
+    "# compute and store the answer in the variable 'num_hurr_d'\n",
+    "# TODO: initialize the variable 'num_hurr_d'\n",
+    "# TODO: loop through all hurricanes in the dataset\n",
+    "# TODO: update the value of 'num_hurr_d' only if\n",
+    "#       the name of the hurricane at the current idx starts with 'D'\n",
+    "        \n",
+    "# display the variable 'num_hurr_d' here\n",
+    "num_hurr_d"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "46d5ca68",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "outputs": [],
+   "source": [
+    "grader.check(\"q25\")"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "e0a161cd-1012-4a92-b6b1-9f0095991720",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "### Task 4.4: Maximization/Minimization\n",
+    "\n",
+    "You will now find the maximum/minimum using loops."
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "73e52ef8",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "**Question 26:** What is the `name` of the hurricane which has the **fastest** wind speed (in `mph`)?\n",
+    "\n",
+    "`None` is a Python keyword which denotes nothing. At the beginning of this loop, by saying `fastest_idx = None`, we make no assumptions about what the fastest hurricane is. Inside the loop, if the `fastest_idx` is `None`, we know that is our first (and currently fastest) hurricane.\n",
+    "\n",
+    "Note that in the skeleton code below, we break ties in favor of the hurricane that **appears first** in the dataset."
+   ]
+  },
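+  {
+   "cell_type": "markdown",
+   "id": "e5a9c2d8",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "Here is the same idea as a minimal sketch on a plain list of made-up numbers; the skeleton code below follows the same shape:\n",
+    "\n",
+    "```python\n",
+    "numbers = [42, 17, 99, 99, 3]  # made-up example values\n",
+    "\n",
+    "biggest_idx = None             # no assumption yet about which item is biggest\n",
+    "biggest_value = 0\n",
+    "for idx in range(len(numbers)):\n",
+    "    current = numbers[idx]\n",
+    "    if biggest_idx is None or current > biggest_value:\n",
+    "        biggest_value = current\n",
+    "        biggest_idx = idx      # either the first item seen, or a strictly bigger one\n",
+    "\n",
+    "biggest_idx                    # 2 (the first 99 wins the tie)\n",
+    "```"
+   ]
+  },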
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "02eb02f0-34b1-4d6a-b2d0-13ce09241cc3",
+   "metadata": {
+    "execution": {
+     "iopub.execute_input": "2023-10-04T01:15:49.491165Z",
+     "iopub.status.busy": "2023-10-04T01:15:49.491165Z",
+     "iopub.status.idle": "2023-10-04T01:15:49.500304Z",
+     "shell.execute_reply": "2023-10-04T01:15:49.499293Z"
+    },
+    "tags": []
+   },
+   "outputs": [],
+   "source": [
+    "# replace the ... with your code\n",
+    "fastest_idx = None\n",
+    "max_speed = 0\n",
+    "for idx in range(project.count()):\n",
+    "    current_speed = ...\n",
+    "    if fastest_idx == None or current_speed > max_speed:\n",
+    "        max_speed = ...\n",
+    "        fastest_idx = idx\n",
+    "fastest_name = project.get_name(fastest_idx)\n",
+    "        \n",
+    "fastest_name"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "5fab8b2b",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "outputs": [],
+   "source": [
+    "grader.check(\"q26\")"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "3c3b8357",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "**Question 27:** What is the `name` of the hurricane which has the **slowest** wind speed (in `mph`)?.\n",
+    "\n",
+    "You **must** break ties in favor of the hurricanes that appear **first** in the dataset."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "05cc4ff6-c0a3-468a-9ee8-6d04cb91682a",
+   "metadata": {
+    "execution": {
+     "iopub.execute_input": "2023-10-04T01:15:49.513457Z",
+     "iopub.status.busy": "2023-10-04T01:15:49.512457Z",
+     "iopub.status.idle": "2023-10-04T01:15:49.520544Z",
+     "shell.execute_reply": "2023-10-04T01:15:49.520544Z"
+    },
+    "tags": []
+   },
+   "outputs": [],
+   "source": [
+    "# replace the ... with your code\n",
+    "slowest_idx = None\n",
+    "min_speed = 0\n",
+    "for idx in range(...):\n",
+    "    current_speed = ...\n",
+    "    if ... or ...:\n",
+    "        min_speed = ...\n",
+    "        slowest_idx = ...\n",
+    "slowest_name = ...\n",
+    "        \n",
+    "slowest_name"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "f0793bab",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "outputs": [],
+   "source": [
+    "grader.check(\"q27\")"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "898f5290",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "### Task 4.5: More Filtering"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "859ed5e5",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "You will now create a function that takes in two years `start_year` and `end_year`, and return the number of hurricanes that were formed between these two years (both years inclusive).\n",
+    "\n",
+    "You **must** use the `get_year` function you defined above to find the year of formation of each hurricane. "
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "b746eeae",
+   "metadata": {
+    "execution": {
+     "iopub.execute_input": "2023-10-04T01:15:49.534245Z",
+     "iopub.status.busy": "2023-10-04T01:15:49.534245Z",
+     "iopub.status.idle": "2023-10-04T01:15:49.539477Z",
+     "shell.execute_reply": "2023-10-04T01:15:49.539477Z"
+    },
+    "tags": []
+   },
+   "outputs": [],
+   "source": [
+    "def count_hurricanes_between(start_year, end_year):\n",
+    "    # replace the ... with your code\n",
+    "    num_hurricanes = 0\n",
+    "    for idx in range(...):\n",
+    "        year_formed = ... # TODO: find the year of formation of the hurricane at idx\n",
+    "        # hint: to find year_formed, you first find the date of formation and pass that value to\n",
+    "        #       another function to find the year from that date.\n",
+    "        #       note that you can perform both computations in a single line by passing the\n",
+    "        #       value returned by one function as an argument to another function.\n",
+    "        if ...: # TODO: evaluate if hurricane at idx was formed between start_year and end_year\n",
+    "            num_hurricanes += 1\n",
+    "    return num_hurricanes"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "be402552",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "**Question 28:** How many hurricanes were `formed` between *1980 and 2002*, both inclusive?\n",
+    "\n",
+    "You **must** answer this question by calling the `count_hurricanes_between` function."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "415fbeeb",
+   "metadata": {
+    "execution": {
+     "iopub.execute_input": "2023-10-04T01:15:49.544488Z",
+     "iopub.status.busy": "2023-10-04T01:15:49.543486Z",
+     "iopub.status.idle": "2023-10-04T01:15:49.550125Z",
+     "shell.execute_reply": "2023-10-04T01:15:49.550125Z"
+    },
+    "tags": []
+   },
+   "outputs": [],
+   "source": [
+    "# replace the ... with your code\n",
+    "hurr_between_1980_2002 = ...\n",
+    "\n",
+    "hurr_between_1980_2002"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "4392deb9",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "outputs": [],
+   "source": [
+    "grader.check(\"q28\")"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "1317449e",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "**Question 29:** How many hurricanes were `formed` between *1901 and 2000*, both inclusive?\n",
+    "\n",
+    "You **must** answer this question by calling the `count_hurricanes_between` function."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "fa9c6318",
+   "metadata": {
+    "execution": {
+     "iopub.execute_input": "2023-10-04T01:15:49.563558Z",
+     "iopub.status.busy": "2023-10-04T01:15:49.562557Z",
+     "iopub.status.idle": "2023-10-04T01:15:49.569331Z",
+     "shell.execute_reply": "2023-10-04T01:15:49.569331Z"
+    },
+    "tags": []
+   },
+   "outputs": [],
+   "source": [
+    "# replace the ... with your code\n",
+    "hurr_between_1901_2000 = ...\n",
+    "\n",
+    "hurr_between_1901_2000"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "d799964a",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "outputs": [],
+   "source": [
+    "grader.check(\"q29\")"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "30483ffa",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "## Segment 5: Fixing Indentation\n",
+    "\n",
+    "### Task 5.1: Debugging\n",
+    "\n",
+    "The `def`, `if`, and `for` statements above use different levels of **indentation** to encode the meaning of the statement. This means, by just changing the indentation level of some code, you might get invalid code that has a **syntax error** or you might get **valid code** that gives a different, **incorrect result**. Therefore, it is an important skill to decide on the **correct indentation level** for and to recognize a wrong indentation level in a piece of code.\n",
+    "\n",
+    "For each of the following questions, you will be provided with a function which has either **syntax/semantic errors** because of **bad indentation**. You **must** fix the indentation to make the functions work as intended.\n",
+    "\n",
+    "**Warning:** You **must** fix the errors **only by changing the indentation**, and **not** by writing any code of your own. \n",
+    "\n",
+    "**Hint:** You can increase the indentation simultaneously for a number of lines by selecting them and hitting the *Tab* key on your keyboard. Similarly, you can decrease their indentation by holding the *Shift* key and then hitting *Tab* on your keyboard."
+   ]
+  },
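+  {
+   "cell_type": "markdown",
+   "id": "f1c8a3b6",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "To see how indentation alone changes the meaning of code, compare these two made-up snippets, which differ only in how far the last line is indented:\n",
+    "\n",
+    "```python\n",
+    "# version 1: the print is INSIDE the loop, so it runs on every iteration\n",
+    "total = 0\n",
+    "for i in range(3):\n",
+    "    total += i\n",
+    "    print(total)  # prints 0, then 1, then 3\n",
+    "\n",
+    "# version 2: the print is OUTSIDE the loop, so it runs once, after the loop ends\n",
+    "total = 0\n",
+    "for i in range(3):\n",
+    "    total += i\n",
+    "print(total)      # prints only 3\n",
+    "```\n",
+    "\n",
+    "A misplaced `return` statement or counter update can cause exactly this kind of error in the functions below."
+   ]
+  },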
+  {
+   "cell_type": "markdown",
+   "id": "b29bb591",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "**Question 30:** Fix the indentation errors in the function below."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "cd45719d",
+   "metadata": {
+    "execution": {
+     "iopub.execute_input": "2023-10-04T01:15:49.582887Z",
+     "iopub.status.busy": "2023-10-04T01:15:49.582887Z",
+     "iopub.status.idle": "2023-10-04T01:15:49.589455Z",
+     "shell.execute_reply": "2023-10-04T01:15:49.588447Z"
+    },
+    "tags": []
+   },
+   "outputs": [],
+   "source": [
+    "# fix any indentation errors you find in the code below\n",
+    "\n",
+    "def count_slower_than(mph):\n",
+    "    '''count_slower_than(mph) returns the number of hurricanes\n",
+    "    with a maximum wind speed less than the given speed'''\n",
+    "    num_hurrs = 0\n",
+    "    for idx in range(project.count()):\n",
+    "        if project.get_mph(idx) < mph:\n",
+    "        num_hurrs += 1\n",
+    "    return num_hurrs"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "8d9361d7",
+   "metadata": {
+    "execution": {
+     "iopub.execute_input": "2023-10-04T01:15:49.593457Z",
+     "iopub.status.busy": "2023-10-04T01:15:49.592456Z",
+     "iopub.status.idle": "2023-10-04T01:15:49.599389Z",
+     "shell.execute_reply": "2023-10-04T01:15:49.599389Z"
+    },
+    "tags": []
+   },
+   "outputs": [],
+   "source": [
+    "# execute this cell without changing anything\n",
+    "count_slower_than_200 = count_slower_than(200)\n",
+    "\n",
+    "count_slower_than_200"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "8e87b037",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "outputs": [],
+   "source": [
+    "grader.check(\"q30\")"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "b73e2250",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "**Question 31:** Fix the indentation errors in the function below.\n",
+    "\n",
+    "Unlike the previous function definition, this one has a **semantic** error, i.e., the code executes without any syntax errors, but the logic behind the code is incorrect. Fix the indentation, so that the code behaves as it is supposed to.\n",
+    "\n",
+    "**Hint:** If you are having trouble identifying the error, you should try tracing through the code using the test examples below. Manually open [`hurricanes.csv`](https://git.doit.wisc.edu/cdis/cs/courses/cs220/cs220-f23-projects/-/tree/main/lab-p5/hurricanes.csv), then go through the function line by line to confirm that it behaves as it ought to."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "a2e3fd2e",
+   "metadata": {
+    "execution": {
+     "iopub.execute_input": "2023-10-04T01:15:49.613209Z",
+     "iopub.status.busy": "2023-10-04T01:15:49.613209Z",
+     "iopub.status.idle": "2023-10-04T01:15:49.619177Z",
+     "shell.execute_reply": "2023-10-04T01:15:49.618169Z"
+    },
+    "tags": []
+   },
+   "outputs": [],
+   "source": [
+    "# fix any indentation errors you find in the code below\n",
+    "\n",
+    "def count_number_of(name):\n",
+    "    '''count_number_of(name) returns the number of hurricanes\n",
+    "    in the dataset with the given name'''\n",
+    "    num_hurrs = 0\n",
+    "    for idx in range(project.count()):\n",
+    "        if project.get_name(idx) == name:\n",
+    "            num_hurrs += 1\n",
+    "        return num_hurrs"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "d457e67f",
+   "metadata": {
+    "execution": {
+     "iopub.execute_input": "2023-10-04T01:15:49.622177Z",
+     "iopub.status.busy": "2023-10-04T01:15:49.622177Z",
+     "iopub.status.idle": "2023-10-04T01:15:49.629316Z",
+     "shell.execute_reply": "2023-10-04T01:15:49.628308Z"
+    },
+    "tags": []
+   },
+   "outputs": [],
+   "source": [
+    "# execute this cell without changing anything\n",
+    "count_number_of_harvey = count_number_of(\"Harvey\")\n",
+    "\n",
+    "count_number_of_harvey"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "e3d97a39",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "outputs": [],
+   "source": [
+    "grader.check(\"q31\")"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "d1e77ead",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "**Question 32:** Fix the indentation errors in the function below.\n",
+    "\n",
+    "This function definition has a **semantic** error, i.e., the code executes without any syntax errors, but the logic behind the code is incorrect. Fix the indentation, so that the code behaves as it is supposed to."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "be5204fc",
+   "metadata": {
+    "execution": {
+     "iopub.execute_input": "2023-10-04T01:15:49.642240Z",
+     "iopub.status.busy": "2023-10-04T01:15:49.641240Z",
+     "iopub.status.idle": "2023-10-04T01:15:49.648054Z",
+     "shell.execute_reply": "2023-10-04T01:15:49.647045Z"
+    },
+    "tags": []
+   },
+   "outputs": [],
+   "source": [
+    "# fix any indentation errors you find in the code below\n",
+    "\n",
+    "def count_deadlier_than(hurr_idx):\n",
+    "    '''count_deadlier_than(hurr_idx) returns the number of \n",
+    "    hurricanes in the dataset which caused more deaths \n",
+    "    than the hurricane with the given index'''\n",
+    "    num_hurrs = 0\n",
+    "    for idx in range(project.count()):\n",
+    "        if project.get_deaths(idx) > project.get_deaths(hurr_idx):\n",
+    "            num_hurrs += 1\n",
+    "            return num_hurrs"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "0d173f83",
+   "metadata": {
+    "execution": {
+     "iopub.execute_input": "2023-10-04T01:15:49.651054Z",
+     "iopub.status.busy": "2023-10-04T01:15:49.651054Z",
+     "iopub.status.idle": "2023-10-04T01:15:49.658291Z",
+     "shell.execute_reply": "2023-10-04T01:15:49.657281Z"
+    },
+    "tags": []
+   },
+   "outputs": [],
+   "source": [
+    "# execute this cell without changing anything\n",
+    "count_deadlier_than_0 = count_deadlier_than(0)\n",
+    "\n",
+    "count_deadlier_than_0"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "b52c1c6f",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "outputs": [],
+   "source": [
+    "grader.check(\"q32\")"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "4c883a5b",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "**Question 33:** Fix the indentation errors in the function below.\n",
+    "\n",
+    "This function definition has a **semantic** error, i.e., the code executes without any syntax errors, but the logic behind the code is incorrect. Fix the indentation, so that the code behaves as it is supposed to.\n",
+    "\n",
+    "Note that for calls to the function below to execute, you must have correctly defined the function `get_year` in Task 3.4.1."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "fe270fc5",
+   "metadata": {
+    "execution": {
+     "iopub.execute_input": "2023-10-04T01:15:49.670926Z",
+     "iopub.status.busy": "2023-10-04T01:15:49.670926Z",
+     "iopub.status.idle": "2023-10-04T01:15:49.677476Z",
+     "shell.execute_reply": "2023-10-04T01:15:49.676469Z"
+    },
+    "tags": []
+   },
+   "outputs": [],
+   "source": [
+    "# fix any indentation errors you find in the code below\n",
+    "\n",
+    "def find_average_mph(year):\n",
+    "    '''find_average_mph(year) returns the average speed of \n",
+    "    hurricanes in the dataset which were formed in the\n",
+    "    given year'''\n",
+    "    total_speed = 0\n",
+    "    num_hurrs = 0\n",
+    "    for idx in range(project.count()):\n",
+    "        if get_year(project.get_formed(idx)) == year:\n",
+    "            total_speed += project.get_mph(idx)\n",
+    "        num_hurrs += 1\n",
+    "    return total_speed/num_hurrs"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "d63aa503",
+   "metadata": {
+    "execution": {
+     "iopub.execute_input": "2023-10-04T01:15:49.680477Z",
+     "iopub.status.busy": "2023-10-04T01:15:49.680477Z",
+     "iopub.status.idle": "2023-10-04T01:15:49.686438Z",
+     "shell.execute_reply": "2023-10-04T01:15:49.686438Z"
+    },
+    "tags": []
+   },
+   "outputs": [],
+   "source": [
+    "# execute this cell without changing anything\n",
+    "find_average_mph_2022 = find_average_mph(2022)\n",
+    "\n",
+    "find_average_mph_2022"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "58a1867b",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "outputs": [],
+   "source": [
+    "grader.check(\"q33\")"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "a5543d06",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "**Question 34:** Fix the indentation errors in the function below.\n",
+    "\n",
+    "This function definition has a **semantic** error, i.e., the code executes without any syntax errors, but the logic behind the code is incorrect. Fix the indentation, so that the code behaves as it is supposed to."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "072bfeb4",
+   "metadata": {
+    "execution": {
+     "iopub.execute_input": "2023-10-04T01:15:49.700565Z",
+     "iopub.status.busy": "2023-10-04T01:15:49.700565Z",
+     "iopub.status.idle": "2023-10-04T01:15:49.706734Z",
+     "shell.execute_reply": "2023-10-04T01:15:49.705725Z"
+    },
+    "tags": []
+   },
+   "outputs": [],
+   "source": [
+    "# fix any indentation errors you find in the code below\n",
+    "\n",
+    "def check_more_than_one(name):\n",
+    "    '''check_more_than_one(name) returns True if \n",
+    "    there is more than one hurricane in the dataset\n",
+    "    with the given name and False otherwise'''\n",
+    "    num_hurrs = 0\n",
+    "    for idx in range(project.count()):\n",
+    "        if project.get_name(idx) == name:\n",
+    "            num_hurrs += 1\n",
+    "        if num_hurrs > 1:\n",
+    "            return True\n",
+    "        else:\n",
+    "            return False"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "3252d6f1",
+   "metadata": {
+    "execution": {
+     "iopub.execute_input": "2023-10-04T01:15:49.710735Z",
+     "iopub.status.busy": "2023-10-04T01:15:49.709736Z",
+     "iopub.status.idle": "2023-10-04T01:15:49.715989Z",
+     "shell.execute_reply": "2023-10-04T01:15:49.715989Z"
+    },
+    "tags": []
+   },
+   "outputs": [],
+   "source": [
+    "# execute this cell without changing anything\n",
+    "check_more_than_one_maria = check_more_than_one(\"Maria\")\n",
+    "\n",
+    "check_more_than_one_maria"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "b5ee956c",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "outputs": [],
+   "source": [
+    "grader.check(\"q34\")"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "f926a821",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "**Question 35:** Fix the indentation errors in the function below.\n",
+    "\n",
+    "This function definition has a **semantic** error, i.e., the code executes without any syntax errors, but the logic behind the code is incorrect. Fix the indentation, so that the code behaves as it is supposed to."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "f0e986e6",
+   "metadata": {
+    "execution": {
+     "iopub.execute_input": "2023-10-04T01:15:49.729999Z",
+     "iopub.status.busy": "2023-10-04T01:15:49.729999Z",
+     "iopub.status.idle": "2023-10-04T01:15:49.736362Z",
+     "shell.execute_reply": "2023-10-04T01:15:49.735351Z"
+    },
+    "tags": []
+   },
+   "outputs": [],
+   "source": [
+    "# fix any indentation errors you find in the code below\n",
+    "\n",
+    "def find_deadliest():\n",
+    "    '''find_deadliest() returns the most number of deaths\n",
+    "    caused by any hurricane in the entire dataset'''\n",
+    "    deadliest_idx = None\n",
+    "    max_deaths = 0\n",
+    "    for idx in range(project.count()):\n",
+    "        curr_deaths = project.get_deaths(idx)\n",
+    "        if deadliest_idx == None or curr_deaths > max_deaths:\n",
+    "            deadliest_idx = idx\n",
+    "        max_deaths = curr_deaths           \n",
+    "    return max_deaths"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "d8350d81",
+   "metadata": {
+    "execution": {
+     "iopub.execute_input": "2023-10-04T01:15:49.740363Z",
+     "iopub.status.busy": "2023-10-04T01:15:49.740363Z",
+     "iopub.status.idle": "2023-10-04T01:15:49.747015Z",
+     "shell.execute_reply": "2023-10-04T01:15:49.746007Z"
+    },
+    "tags": []
+   },
+   "outputs": [],
+   "source": [
+    "# execute this cell without changing anything\n",
+    "deadliest_hurr = find_deadliest()\n",
+    "\n",
+    "deadliest_hurr"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "7c84f491",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "outputs": [],
+   "source": [
+    "grader.check(\"q35\")"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "207b1317",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "**Question 36:** Fix the indentation errors in the function below.\n",
+    "\n",
+    "This function definition has a **semantic** error, i.e., the code executes without any syntax errors, but the logic behind the code is incorrect. Fix the indentation, so that the code behaves as it is supposed to."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "85be1c35",
+   "metadata": {
+    "execution": {
+     "iopub.execute_input": "2023-10-04T01:15:49.760499Z",
+     "iopub.status.busy": "2023-10-04T01:15:49.760499Z",
+     "iopub.status.idle": "2023-10-04T01:15:49.767422Z",
+     "shell.execute_reply": "2023-10-04T01:15:49.766415Z"
+    },
+    "tags": []
+   },
+   "outputs": [],
+   "source": [
+    "# fix any indentation errors you find in the code below\n",
+    "\n",
+    "def count_more_common(name1, name2):\n",
+    "    '''count_more_common(name1, name2) returns which of \n",
+    "    the two given hurricane names appears more often\n",
+    "    in the dataset, and returns \"Draw\" if they appear\n",
+    "    an equal number of times'''\n",
+    "    num_name1 = 0\n",
+    "    num_name2 = 0\n",
+    "    for idx in range(project.count()):\n",
+    "        if project.get_name(idx) == name1:\n",
+    "            num_name1 += 1\n",
+    "        elif project.get_name(idx) == name2:\n",
+    "            num_name2 += 1\n",
+    "        if num_name1 > num_name2:\n",
+    "            return name1\n",
+    "        elif num_name1 < num_name2:\n",
+    "            return name2\n",
+    "        else:\n",
+    "            return \"Draw\""
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "b3e6adb3",
+   "metadata": {
+    "execution": {
+     "iopub.execute_input": "2023-10-04T01:15:49.770422Z",
+     "iopub.status.busy": "2023-10-04T01:15:49.770422Z",
+     "iopub.status.idle": "2023-10-04T01:15:49.777429Z",
+     "shell.execute_reply": "2023-10-04T01:15:49.776422Z"
+    },
+    "tags": []
+   },
+   "outputs": [],
+   "source": [
+    "# execute this cell without changing anything\n",
+    "count_more_common_1828_louisiana_katrina = count_more_common(\"1812 Louisiana hurricane\", \"Katrina\")\n",
+    "\n",
+    "count_more_common_1828_louisiana_katrina"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "91ce8c59",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "outputs": [],
+   "source": [
+    "grader.check(\"q36\")"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "e709c6a0",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "## Great work! You are now ready to start [P5](https://git.doit.wisc.edu/cdis/cs/courses/cs220/cs220-f23-projects/-/tree/main/p5)"
+   ]
+  }
+ ],
+ "metadata": {
+  "kernelspec": {
+   "display_name": "Python 3 (ipykernel)",
+   "language": "python",
+   "name": "python3"
+  },
+  "language_info": {
+   "codemirror_mode": {
+    "name": "ipython",
+    "version": 3
+   },
+   "file_extension": ".py",
+   "mimetype": "text/x-python",
+   "name": "python",
+   "nbconvert_exporter": "python",
+   "pygments_lexer": "ipython3",
+   "version": "3.11.4"
+  },
+  "otter": {
+   "OK_FORMAT": true,
+   "tests": {
+    "q1": {
+     "name": "q1",
+     "points": 0,
+     "suites": [
+      {
+       "cases": [
+        {
+         "code": ">>> public_tests.check('q1', name_idx0)\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false
+        }
+       ],
+       "scored": true,
+       "setup": "",
+       "teardown": "",
+       "type": "doctest"
+      }
+     ]
+    },
+    "q10": {
+     "name": "q10",
+     "points": 2.5,
+     "suites": [
+      {
+       "cases": [
+        {
+         "code": ">>> public_tests.check('q10', dept_code)\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false
+        }
+       ],
+       "scored": true,
+       "setup": "",
+       "teardown": "",
+       "type": "doctest"
+      }
+     ]
+    },
+    "q11": {
+     "name": "q11",
+     "points": 2.5,
+     "suites": [
+      {
+       "cases": [
+        {
+         "code": ">>> public_tests.check('q11', course_code)\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false
+        }
+       ],
+       "scored": true,
+       "setup": "",
+       "teardown": "",
+       "type": "doctest"
+      }
+     ]
+    },
+    "q12": {
+     "name": "q12",
+     "points": 2,
+     "suites": [
+      {
+       "cases": [
+        {
+         "code": ">>> public_tests.check('q12', suffix)\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false
+        }
+       ],
+       "scored": true,
+       "setup": "",
+       "teardown": "",
+       "type": "doctest"
+      }
+     ]
+    },
+    "q13": {
+     "name": "q13",
+     "points": 3,
+     "suites": [
+      {
+       "cases": [
+        {
+         "code": ">>> public_tests.check('q13', billions)\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false
+        }
+       ],
+       "scored": true,
+       "setup": "",
+       "teardown": "",
+       "type": "doctest"
+      }
+     ]
+    },
+    "q14": {
+     "name": "q14",
+     "points": 5,
+     "suites": [
+      {
+       "cases": [
+        {
+         "code": ">>> public_tests.check('q14', month_idx0)\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false
+        }
+       ],
+       "scored": true,
+       "setup": "",
+       "teardown": "",
+       "type": "doctest"
+      }
+     ]
+    },
+    "q15": {
+     "name": "q15",
+     "points": 2.5,
+     "suites": [
+      {
+       "cases": [
+        {
+         "code": ">>> public_tests.check('q15', year)\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false
+        }
+       ],
+       "scored": true,
+       "setup": "",
+       "teardown": "",
+       "type": "doctest"
+      }
+     ]
+    },
+    "q16": {
+     "name": "q16",
+     "points": 2.5,
+     "suites": [
+      {
+       "cases": [
+        {
+         "code": ">>> public_tests.check('q16', day)\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false
+        }
+       ],
+       "scored": true,
+       "setup": "",
+       "teardown": "",
+       "type": "doctest"
+      }
+     ]
+    },
+    "q17": {
+     "name": "q17",
+     "points": 5,
+     "suites": [
+      {
+       "cases": [
+        {
+         "code": ">>> public_tests.check('q17', day_formed_idx100)\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false
+        }
+       ],
+       "scored": true,
+       "setup": "",
+       "teardown": "",
+       "type": "doctest"
+      }
+     ]
+    },
+    "q18": {
+     "name": "q18",
+     "points": 5,
+     "suites": [
+      {
+       "cases": [
+        {
+         "code": ">>> public_tests.check('q18', year_formed_idx200)\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false
+        }
+       ],
+       "scored": true,
+       "setup": "",
+       "teardown": "",
+       "type": "doctest"
+      }
+     ]
+    },
+    "q19": {
+     "name": "q19",
+     "points": 5,
+     "suites": [
+      {
+       "cases": [
+        {
+         "code": ">>> public_tests.check('q19', month_diss_idx300)\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false
+        }
+       ],
+       "scored": true,
+       "setup": "",
+       "teardown": "",
+       "type": "doctest"
+      }
+     ]
+    },
+    "q2": {
+     "name": "q2",
+     "points": 1,
+     "suites": [
+      {
+       "cases": [
+        {
+         "code": ">>> public_tests.check('q2', name_idx1)\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false
+        }
+       ],
+       "scored": true,
+       "setup": "",
+       "teardown": "",
+       "type": "doctest"
+      }
+     ]
+    },
+    "q20": {
+     "name": "q20",
+     "points": 5,
+     "suites": [
+      {
+       "cases": [
+        {
+         "code": ">>> public_tests.check('q20', sum_while)\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false
+        }
+       ],
+       "scored": true,
+       "setup": "",
+       "teardown": "",
+       "type": "doctest"
+      }
+     ]
+    },
+    "q21": {
+     "name": "q21",
+     "points": 5,
+     "suites": [
+      {
+       "cases": [
+        {
+         "code": ">>> public_tests.check('q21', sum_for)\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false
+        }
+       ],
+       "scored": true,
+       "setup": "",
+       "teardown": "",
+       "type": "doctest"
+      }
+     ]
+    },
+    "q22": {
+     "name": "q22",
+     "points": 5,
+     "suites": [
+      {
+       "cases": [
+        {
+         "code": ">>> public_tests.check('q22', total_deaths_first10)\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false
+        }
+       ],
+       "scored": true,
+       "setup": "",
+       "teardown": "",
+       "type": "doctest"
+      }
+     ]
+    },
+    "q23": {
+     "name": "q23",
+     "points": 5,
+     "suites": [
+      {
+       "cases": [
+        {
+         "code": ">>> public_tests.check('q23', average_wind_speed)\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false
+        }
+       ],
+       "scored": true,
+       "setup": "",
+       "teardown": "",
+       "type": "doctest"
+      }
+     ]
+    },
+    "q24": {
+     "name": "q24",
+     "points": 5,
+     "suites": [
+      {
+       "cases": [
+        {
+         "code": ">>> public_tests.check('q24', num_hurr_1000_deaths)\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false
+        }
+       ],
+       "scored": true,
+       "setup": "",
+       "teardown": "",
+       "type": "doctest"
+      }
+     ]
+    },
+    "q25": {
+     "name": "q25",
+     "points": 5,
+     "suites": [
+      {
+       "cases": [
+        {
+         "code": ">>> public_tests.check('q25', num_hurr_d)\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false
+        }
+       ],
+       "scored": true,
+       "setup": "",
+       "teardown": "",
+       "type": "doctest"
+      }
+     ]
+    },
+    "q26": {
+     "name": "q26",
+     "points": 5,
+     "suites": [
+      {
+       "cases": [
+        {
+         "code": ">>> public_tests.check('q26', fastest_name)\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false
+        }
+       ],
+       "scored": true,
+       "setup": "",
+       "teardown": "",
+       "type": "doctest"
+      }
+     ]
+    },
+    "q27": {
+     "name": "q27",
+     "points": 5,
+     "suites": [
+      {
+       "cases": [
+        {
+         "code": ">>> public_tests.check('q27', slowest_name)\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false
+        }
+       ],
+       "scored": true,
+       "setup": "",
+       "teardown": "",
+       "type": "doctest"
+      }
+     ]
+    },
+    "q28": {
+     "name": "q28",
+     "points": 2.5,
+     "suites": [
+      {
+       "cases": [
+        {
+         "code": ">>> public_tests.check('q28', hurr_between_1980_2002)\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false
+        }
+       ],
+       "scored": true,
+       "setup": "",
+       "teardown": "",
+       "type": "doctest"
+      }
+     ]
+    },
+    "q29": {
+     "name": "q29",
+     "points": 2.5,
+     "suites": [
+      {
+       "cases": [
+        {
+         "code": ">>> public_tests.check('q29', hurr_between_1901_2000)\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false
+        }
+       ],
+       "scored": true,
+       "setup": "",
+       "teardown": "",
+       "type": "doctest"
+      }
+     ]
+    },
+    "q3": {
+     "name": "q3",
+     "points": 2,
+     "suites": [
+      {
+       "cases": [
+        {
+         "code": ">>> public_tests.check('q3', mph_idx7)\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false
+        }
+       ],
+       "scored": true,
+       "setup": "",
+       "teardown": "",
+       "type": "doctest"
+      }
+     ]
+    },
+    "q30": {
+     "name": "q30",
+     "points": 0,
+     "suites": [
+      {
+       "cases": [
+        {
+         "code": ">>> public_tests.check('q30', count_slower_than_200)\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false
+        }
+       ],
+       "scored": true,
+       "setup": "",
+       "teardown": "",
+       "type": "doctest"
+      }
+     ]
+    },
+    "q31": {
+     "name": "q31",
+     "points": 0.5,
+     "suites": [
+      {
+       "cases": [
+        {
+         "code": ">>> public_tests.check('q31', count_number_of_harvey)\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false
+        }
+       ],
+       "scored": true,
+       "setup": "",
+       "teardown": "",
+       "type": "doctest"
+      }
+     ]
+    },
+    "q32": {
+     "name": "q32",
+     "points": 0.5,
+     "suites": [
+      {
+       "cases": [
+        {
+         "code": ">>> public_tests.check('q32', count_deadlier_than_0)\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false
+        }
+       ],
+       "scored": true,
+       "setup": "",
+       "teardown": "",
+       "type": "doctest"
+      }
+     ]
+    },
+    "q33": {
+     "name": "q33",
+     "points": 1,
+     "suites": [
+      {
+       "cases": [
+        {
+         "code": ">>> public_tests.check('q33', find_average_mph_2022)\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false
+        }
+       ],
+       "scored": true,
+       "setup": "",
+       "teardown": "",
+       "type": "doctest"
+      }
+     ]
+    },
+    "q34": {
+     "name": "q34",
+     "points": 1,
+     "suites": [
+      {
+       "cases": [
+        {
+         "code": ">>> public_tests.check('q34', check_more_than_one_maria)\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false
+        }
+       ],
+       "scored": true,
+       "setup": "",
+       "teardown": "",
+       "type": "doctest"
+      }
+     ]
+    },
+    "q35": {
+     "name": "q35",
+     "points": 1,
+     "suites": [
+      {
+       "cases": [
+        {
+         "code": ">>> public_tests.check('q35', deadliest_hurr)\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false
+        }
+       ],
+       "scored": true,
+       "setup": "",
+       "teardown": "",
+       "type": "doctest"
+      }
+     ]
+    },
+    "q36": {
+     "name": "q36",
+     "points": 1,
+     "suites": [
+      {
+       "cases": [
+        {
+         "code": ">>> public_tests.check('q36', count_more_common_1828_louisiana_katrina)\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false
+        }
+       ],
+       "scored": true,
+       "setup": "",
+       "teardown": "",
+       "type": "doctest"
+      }
+     ]
+    },
+    "q4": {
+     "name": "q4",
+     "points": 2,
+     "suites": [
+      {
+       "cases": [
+        {
+         "code": ">>> public_tests.check('q4', damage_idx5)\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false
+        }
+       ],
+       "scored": true,
+       "setup": "",
+       "teardown": "",
+       "type": "doctest"
+      }
+     ]
+    },
+    "q5": {
+     "name": "q5",
+     "points": 5,
+     "suites": [
+      {
+       "cases": [
+        {
+         "code": ">>> public_tests.check('q5', name_idx_last)\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false
+        }
+       ],
+       "scored": true,
+       "setup": "",
+       "teardown": "",
+       "type": "doctest"
+      }
+     ]
+    },
+    "q6": {
+     "name": "q6",
+     "points": 1.25,
+     "suites": [
+      {
+       "cases": [
+        {
+         "code": ">>> public_tests.check('q6', last_digit)\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false
+        }
+       ],
+       "scored": true,
+       "setup": "",
+       "teardown": "",
+       "type": "doctest"
+      }
+     ]
+    },
+    "q7": {
+     "name": "q7",
+     "points": 1.25,
+     "suites": [
+      {
+       "cases": [
+        {
+         "code": ">>> public_tests.check('q7', area_code)\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false
+        }
+       ],
+       "scored": true,
+       "setup": "",
+       "teardown": "",
+       "type": "doctest"
+      }
+     ]
+    },
+    "q8": {
+     "name": "q8",
+     "points": 1.25,
+     "suites": [
+      {
+       "cases": [
+        {
+         "code": ">>> public_tests.check('q8', line_number)\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false
+        }
+       ],
+       "scored": true,
+       "setup": "",
+       "teardown": "",
+       "type": "doctest"
+      }
+     ]
+    },
+    "q9": {
+     "name": "q9",
+     "points": 1.25,
+     "suites": [
+      {
+       "cases": [
+        {
+         "code": ">>> public_tests.check('q9', exchange_code)\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false
+        }
+       ],
+       "scored": true,
+       "setup": "",
+       "teardown": "",
+       "type": "doctest"
+      }
+     ]
+    }
+   }
+  }
+ },
+ "nbformat": 4,
+ "nbformat_minor": 5
+}
diff --git a/lab-p5/project.py b/lab-p5/project.py
new file mode 100644
index 0000000000000000000000000000000000000000..118e11b490d2d265e67c372322faf66c1345d358
--- /dev/null
+++ b/lab-p5/project.py
@@ -0,0 +1,49 @@
+__hurricane__ = []
+
+
+def __init__():
+    """This function will read in the CSV file and store it in a list of dictionaries"""
+    import csv
+    __hurricane__.clear()
+    with open('hurricanes.csv', mode='r') as csv_file:
+        csv_reader = csv.DictReader(csv_file)
+        for row in csv_reader:
+            __hurricane__.append(row)
+
+
+def count():
+    """This function will return the number of records in the dataset"""
+    return len(__hurricane__)
+
+
+def get_name(idx):
+    """get_name(idx) returns the name of the hurricane in row idx"""
+    return __hurricane__[int(idx)]['name']
+
+
+def get_formed(idx):
+    """get_formed(idx) returns the date of formation of the hurricane in row idx"""
+    return __hurricane__[int(idx)]['formed']
+
+
+def get_dissipated(idx):
+    """get_dissipated(idx) returns the date of dissipation of the hurricane in row idx"""
+    return __hurricane__[int(idx)]['dissipated']
+
+
+def get_mph(idx):
+    """get_mph(idx) returns the mph of the hurricane in row idx"""
+    return int(__hurricane__[int(idx)]['mph'])
+
+
+def get_damage(idx):
+    """get_damage(idx) returns the damage in dollars of the hurricane in row idx"""
+    return __hurricane__[int(idx)]['damage']
+
+
+def get_deaths(idx):
+    """get_deaths(idx) returns the deaths of the hurricane in row idx"""
+    return int(__hurricane__[int(idx)]['deaths'])
+
+
+__init__()
diff --git a/lab-p5/public_tests.py b/lab-p5/public_tests.py
new file mode 100644
index 0000000000000000000000000000000000000000..8070d014c9062a9f15fe22286cf726ee731c43d1
--- /dev/null
+++ b/lab-p5/public_tests.py
@@ -0,0 +1,840 @@
+#!/usr/bin/python
+# +
+import os, json, math, copy
+from collections import namedtuple
+from bs4 import BeautifulSoup
+
+HIDDEN_FILE = os.path.join("hidden", "hidden_tests.py")
+if os.path.exists(HIDDEN_FILE):
+    import hidden.hidden_tests as hidn
+# -
+
+MAX_FILE_SIZE = 750 # units - KB
+REL_TOL = 6e-04  # relative tolerance for floats
+ABS_TOL = 15e-03  # absolute tolerance for floats
+TOTAL_SCORE = 100 # total score for the project
+
+DF_FILE = 'expected_dfs.html'
+PLOT_FILE = 'expected_plots.json'
+
+PASS = "All test cases passed!"
+
+TEXT_FORMAT = "TEXT_FORMAT"  # question type when expected answer is a type, str, int, float, or bool
+TEXT_FORMAT_UNORDERED_LIST = "TEXT_FORMAT_UNORDERED_LIST"  # question type when the expected answer is a list or a set where the order does *not* matter
+TEXT_FORMAT_ORDERED_LIST = "TEXT_FORMAT_ORDERED_LIST"  # question type when the expected answer is a list or tuple where the order does matter
+TEXT_FORMAT_DICT = "TEXT_FORMAT_DICT"  # question type when the expected answer is a dictionary
+TEXT_FORMAT_SPECIAL_ORDERED_LIST = "TEXT_FORMAT_SPECIAL_ORDERED_LIST"  # question type when the expected answer is a list where order does matter, but with possible ties. Elements are ordered according to values in special_ordered_json (with ties allowed)
+TEXT_FORMAT_NAMEDTUPLE = "TEXT_FORMAT_NAMEDTUPLE"  # question type when expected answer is a namedtuple
+PNG_FORMAT_SCATTER = "PNG_FORMAT_SCATTER" # question type when the expected answer is a scatter plot
+HTML_FORMAT = "HTML_FORMAT" # question type when the expected answer is a DataFrame
+FILE_JSON_FORMAT = "FILE_JSON_FORMAT" # question type when the expected answer is a JSON file
+SLASHES = " SLASHES" # question SUFFIX when expected answer contains paths with slashes
+
+def get_expected_format():
+    """get_expected_format() returns a dict mapping each question to the format
+    of the expected answer."""
+    expected_format = {'q1': 'TEXT_FORMAT',
+                       'q2': 'TEXT_FORMAT',
+                       'q3': 'TEXT_FORMAT',
+                       'q4': 'TEXT_FORMAT',
+                       'q5': 'TEXT_FORMAT',
+                       'q6': 'TEXT_FORMAT',
+                       'q7': 'TEXT_FORMAT',
+                       'q8': 'TEXT_FORMAT',
+                       'q9': 'TEXT_FORMAT',
+                       'q10': 'TEXT_FORMAT',
+                       'q11': 'TEXT_FORMAT',
+                       'q12': 'TEXT_FORMAT',
+                       'q13': 'TEXT_FORMAT',
+                       'q14': 'TEXT_FORMAT',
+                       'q15': 'TEXT_FORMAT',
+                       'q16': 'TEXT_FORMAT',
+                       'q17': 'TEXT_FORMAT',
+                       'q18': 'TEXT_FORMAT',
+                       'q19': 'TEXT_FORMAT',
+                       'q20': 'TEXT_FORMAT',
+                       'q21': 'TEXT_FORMAT',
+                       'q22': 'TEXT_FORMAT',
+                       'q23': 'TEXT_FORMAT',
+                       'q24': 'TEXT_FORMAT',
+                       'q25': 'TEXT_FORMAT',
+                       'q26': 'TEXT_FORMAT',
+                       'q27': 'TEXT_FORMAT',
+                       'q28': 'TEXT_FORMAT',
+                       'q29': 'TEXT_FORMAT',
+                       'q30': 'TEXT_FORMAT',
+                       'q31': 'TEXT_FORMAT',
+                       'q32': 'TEXT_FORMAT',
+                       'q33': 'TEXT_FORMAT',
+                       'q34': 'TEXT_FORMAT',
+                       'q35': 'TEXT_FORMAT',
+                       'q36': 'TEXT_FORMAT'}
+    return expected_format
+
+
+def get_expected_json():
+    """get_expected_json() returns a dict mapping each question to the expected
+    answer (if the format permits it)."""
+    expected_json = {'q1': '1804 New England hurricane',
+                     'q2': '1806 Great Coastal hurricane',
+                     'q3': 105,
+                     'q4': '1M',
+                     'q5': 'Nicole',
+                     'q6': '9',
+                     'q7': '608',
+                     'q8': '5309',
+                     'q9': '867',
+                     'q10': 'CS',
+                     'q11': '220',
+                     'q12': 'B',
+                     'q13': 3.19,
+                     'q14': 10,
+                     'q15': 2023,
+                     'q16': 22,
+                     'q17': 13,
+                     'q18': 1979,
+                     'q19': 9,
+                     'q20': 325,
+                     'q21': 325,
+                     'q22': 1920,
+                     'q23': 99.53971119133574,
+                     'q24': 19,
+                     'q25': 41,
+                     'q26': 'Allen',
+                     'q27': '1975 Tropical Depression Six',
+                     'q28': 131,
+                     'q29': 295,
+                     'q30': 554,
+                     'q31': 3,
+                     'q32': 198,
+                     'q33': 97.77777777777777,
+                     'q34': True,
+                     'q35': 8000,
+                     'q36': 'Katrina'}
+    return expected_json
+
+
+def get_special_json():
+    """get_special_json() returns a dict mapping each question to the expected
+    answer stored in a special format as a list of tuples. Each tuple contains
+    the element expected in the list, and its corresponding value. Any two
+    elements with the same value can appear in any order in the actual list,
+    but if two elements have different values, then they must appear in the
+    same order as in the expected list of tuples."""
+    special_json = {}
+    return special_json
+
+
+def compare(expected, actual, q_format=TEXT_FORMAT):
+    """compare(expected, actual) is used to compare when the format of
+    the expected answer is known for certain."""
+    try:
+        if q_format == TEXT_FORMAT:
+            return simple_compare(expected, actual)
+        elif q_format == TEXT_FORMAT_UNORDERED_LIST:
+            return list_compare_unordered(expected, actual)
+        elif q_format == TEXT_FORMAT_ORDERED_LIST:
+            return list_compare_ordered(expected, actual)
+        elif q_format == TEXT_FORMAT_DICT:
+            return dict_compare(expected, actual)
+        elif q_format == TEXT_FORMAT_SPECIAL_ORDERED_LIST:
+            return list_compare_special(expected, actual)
+        elif q_format == TEXT_FORMAT_NAMEDTUPLE:
+            return namedtuple_compare(expected, actual)
+        elif q_format == PNG_FORMAT_SCATTER:
+            return compare_flip_dicts(expected, actual)
+        elif q_format == HTML_FORMAT:
+            return compare_cell_html(expected, actual)
+        elif q_format == FILE_JSON_FORMAT:
+            return compare_json(expected, actual)
+        else:
+            if expected != actual:
+                return "expected %s but found %s " % (repr(expected), repr(actual))
+    except:
+        if expected != actual:
+            return "expected %s" % (repr(expected))
+    return PASS
+
+
+def print_message(expected, actual, complete_msg=True):
+    """print_message(expected, actual) displays a simple error message."""
+    msg = "expected %s" % (repr(expected))
+    if complete_msg:
+        msg = msg + " but found %s" % (repr(actual))
+    return msg
+
+
+def simple_compare(expected, actual, complete_msg=True):
+    """simple_compare(expected, actual) is used to compare when the expected answer
+    is a type/None/str/int/float/bool. When the expected answer is a float,
+    the actual answer is allowed to be within the tolerance limit. Otherwise,
+    the values must match exactly, or a very simple error message is displayed."""
+    msg = PASS
+    if 'numpy' in repr(type((actual))):
+        actual = actual.item()
+    if isinstance(expected, type):
+        if expected != actual:
+            if isinstance(actual, type):
+                msg = "expected %s but found %s" % (expected.__name__, actual.__name__)
+            else:
+                msg = "expected %s but found %s" % (expected.__name__, repr(actual))
+    elif not isinstance(actual, type(expected)) and not (isinstance(expected, (float, int)) and isinstance(actual, (float, int))):
+        msg = "expected to find type %s but found type %s" % (type(expected).__name__, type(actual).__name__)
+    elif isinstance(expected, float):
+        if not math.isclose(actual, expected, rel_tol=REL_TOL, abs_tol=ABS_TOL):
+            msg = print_message(expected, actual, complete_msg)
+    elif isinstance(expected, (list, tuple)) or is_namedtuple(expected):
+        new_msg = print_message(expected, actual, complete_msg)
+        if len(expected) != len(actual):
+            return new_msg
+        for i in range(len(expected)):
+            val = simple_compare(expected[i], actual[i])
+            if val != PASS:
+                return new_msg
+    elif isinstance(expected, dict):
+        new_msg = print_message(expected, actual, complete_msg)
+        if len(expected) != len(actual):
+            return new_msg
+        val = simple_compare(list(expected.keys()), list(actual.keys()))
+        if val != PASS:
+            return new_msg
+        for key in expected:
+            val = simple_compare(expected[key], actual[key])
+            if val != PASS:
+                return new_msg
+    else:
+        if expected != actual:
+            msg = print_message(expected, actual, complete_msg)
+    return msg
+
+
+def intelligent_compare(expected, actual, obj=None):
+    """intelligent_compare(expected, actual) is used to compare when the
+    data type of the expected answer is not known for certain, and default
+    assumptions need to be made."""
+    if obj == None:
+        obj = type(expected).__name__
+    if is_namedtuple(expected):
+        msg = namedtuple_compare(expected, actual)
+    elif isinstance(expected, (list, tuple)):
+        msg = list_compare_ordered(expected, actual, obj)
+    elif isinstance(expected, set):
+        msg = list_compare_unordered(expected, actual, obj)
+    elif isinstance(expected, (dict)):
+        msg = dict_compare(expected, actual)
+    else:
+        msg = simple_compare(expected, actual)
+    msg = msg.replace("CompDict", "dict").replace("CompSet", "set").replace("NewNone", "None")
+    return msg
+
+
+def is_namedtuple(obj, init_check=True):
+    """is_namedtuple(obj) returns True if `obj` is a namedtuple object
+    defined in the test file."""
+    bases = type(obj).__bases__
+    if len(bases) != 1 or bases[0] != tuple:
+        return False
+    fields = getattr(type(obj), '_fields', None)
+    if not isinstance(fields, tuple):
+        return False
+    if init_check and not type(obj).__name__ in [nt.__name__ for nt in _expected_namedtuples]:
+        return False
+    return True
+
+
+def list_compare_ordered(expected, actual, obj=None):
+    """list_compare_ordered(expected, actual) is used to compare when the
+    expected answer is a list/tuple, where the order of the elements matters."""
+    msg = PASS
+    if not isinstance(actual, type(expected)):
+        msg = "expected to find type %s but found type %s" % (type(expected).__name__, type(actual).__name__)
+        return msg
+    if obj == None:
+        obj = type(expected).__name__
+    for i in range(len(expected)):
+        if i >= len(actual):
+            msg = "at index %d of the %s, expected missing %s" % (i, obj, repr(expected[i]))
+            break
+        val = intelligent_compare(expected[i], actual[i], "sub" + obj)
+        if val != PASS:
+            msg = "at index %d of the %s, " % (i, obj) + val
+            break
+    if len(actual) > len(expected) and msg == PASS:
+        msg = "at index %d of the %s, found unexpected %s" % (len(expected), obj, repr(actual[len(expected)]))
+    if len(expected) != len(actual):
+        msg = msg + " (found %d entries in %s, but expected %d)" % (len(actual), obj, len(expected))
+
+    if len(expected) > 0:
+        try:
+            if msg != PASS and list_compare_unordered(expected, actual, obj) == PASS:
+                msg = msg + " (%s may not be ordered as required)" % (obj)
+        except:
+            pass
+    return msg
+
+
+def list_compare_helper(larger, smaller):
+    """list_compare_helper(larger, smaller) is a helper function which takes in
+    two lists of possibly unequal sizes and finds the item that is not present
+    in the smaller list, if there is such an element."""
+    msg = PASS
+    j = 0
+    for i in range(len(larger)):
+        if i == len(smaller):
+            msg = "expected %s" % (repr(larger[i]))
+            break
+        found = False
+        while not found:
+            if j == len(smaller):
+                val = simple_compare(larger[i], smaller[j - 1], complete_msg=False)
+                break
+            val = simple_compare(larger[i], smaller[j], complete_msg=False)
+            j += 1
+            if val == PASS:
+                found = True
+                break
+        if not found:
+            msg = val
+            break
+    return msg
+
+class NewNone():
+    """alternate class in place of None, which allows for comparison with
+    all other data types."""
+    def __str__(self):
+        return 'None'
+    def __repr__(self):
+        return 'None'
+    def __lt__(self, other):
+        return True
+    def __le__(self, other):
+        return True
+    def __gt__(self, other):
+        return False
+    def __ge__(self, other):
+        return other == None
+    def __eq__(self, other):
+        return other == None
+    def __ne__(self, other):
+        return other != None
+
+class CompDict(dict):
+    """subclass of dict, which allows for comparison with other dicts."""
+    def __init__(self, vals):
+        super(self.__class__, self).__init__(vals)
+        if type(vals) == CompDict:
+            self.val = vals.val
+        elif isinstance(vals, dict):
+            self.val = self.get_equiv(vals)
+        else:
+            raise TypeError("'%s' object cannot be type casted to CompDict class" % type(vals).__name__)
+
+    def get_equiv(self, vals):
+        val = []
+        for key in sorted(list(vals.keys())):
+            val.append((key, vals[key]))
+        return val
+
+    def __str__(self):
+        return str(dict(self.val))
+    def __repr__(self):
+        return repr(dict(self.val))
+    def __lt__(self, other):
+        return self.val < CompDict(other).val
+    def __le__(self, other):
+        return self.val <= CompDict(other).val
+    def __gt__(self, other):
+        return self.val > CompDict(other).val
+    def __ge__(self, other):
+        return self.val >= CompDict(other).val
+    def __eq__(self, other):
+        return self.val == CompDict(other).val
+    def __ne__(self, other):
+        return self.val != CompDict(other).val
+
+class CompSet(set):
+    """subclass of set, which allows for comparison with other sets."""
+    def __init__(self, vals):
+        super(self.__class__, self).__init__(vals)
+        if type(vals) == CompSet:
+            self.val = vals.val
+        elif isinstance(vals, set):
+            self.val = self.get_equiv(vals)
+        else:
+            raise TypeError("'%s' object cannot be type casted to CompSet class" % type(vals).__name__)
+
+    def get_equiv(self, vals):
+        return sorted(list(vals))
+
+    def __str__(self):
+        return str(set(self.val))
+    def __repr__(self):
+        return repr(set(self.val))
+    def __getitem__(self, index):
+        return self.val[index]
+    def __lt__(self, other):
+        return self.val < CompSet(other).val
+    def __le__(self, other):
+        return self.val <= CompSet(other).val
+    def __gt__(self, other):
+        return self.val > CompSet(other).val
+    def __ge__(self, other):
+        return self.val >= CompSet(other).val
+    def __eq__(self, other):
+        return self.val == CompSet(other).val
+    def __ne__(self, other):
+        return self.val != CompSet(other).val
+
+def make_sortable(item):
+    """make_sortable(item) replaces all Nones in `item` with an alternate
+    class that allows for comparison with str/int/float/bool/list/set/tuple/dict.
+    It also replaces all dicts (and sets) with a subclass that allows for
+    comparison with other dicts (and sets)."""
+    if item == None:
+        return NewNone()
+    elif isinstance(item, (type, str, int, float, bool)):
+        return item
+    elif isinstance(item, (list, set, tuple)):
+        new_item = []
+        for subitem in item:
+            new_item.append(make_sortable(subitem))
+        if is_namedtuple(item):
+            return type(item)(*new_item)
+        elif isinstance(item, set):
+            return CompSet(new_item)
+        else:
+            return type(item)(new_item)
+    elif isinstance(item, dict):
+        new_item = {}
+        for key in item:
+            new_item[key] = make_sortable(item[key])
+        return CompDict(new_item)
+    return item
+
+def list_compare_unordered(expected, actual, obj=None):
+    """list_compare_unordered(expected, actual) is used to compare when the
+    expected answer is a list/set where the order of the elements does not matter."""
+    msg = PASS
+    if not isinstance(actual, type(expected)):
+        msg = "expected to find type %s but found type %s" % (type(expected).__name__, type(actual).__name__)
+        return msg
+    if obj == None:
+        obj = type(expected).__name__
+
+    try:
+        sort_expected = sorted(make_sortable(expected))
+        sort_actual = sorted(make_sortable(actual))
+    except:
+        return "unexpected datatype found in %s; expected entries of type %s" % (obj, obj, type(expected[0]).__name__)
+
+    if len(actual) == 0 and len(expected) > 0:
+        msg = "in the %s, missing" % (obj) + sort_expected[0]
+    elif len(actual) > 0 and len(expected) > 0:
+        val = intelligent_compare(sort_expected[0], sort_actual[0])
+        if val.startswith("expected to find type"):
+            msg = "in the %s, " % (obj) + simple_compare(sort_expected[0], sort_actual[0])
+        else:
+            if len(expected) > len(actual):
+                msg = "in the %s, missing " % (obj) + list_compare_helper(sort_expected, sort_actual)
+            elif len(expected) < len(actual):
+                msg = "in the %s, found un" % (obj) + list_compare_helper(sort_actual, sort_expected)
+            if len(expected) != len(actual):
+                msg = msg + " (found %d entries in %s, but expected %d)" % (len(actual), obj, len(expected))
+                return msg
+            else:
+                val = list_compare_helper(sort_expected, sort_actual)
+                if val != PASS:
+                    msg = "in the %s, missing " % (obj) + val + ", but found un" + list_compare_helper(sort_actual,
+                                                                                               sort_expected)
+    return msg
+
+
+def namedtuple_compare(expected, actual):
+    """namedtuple_compare(expected, actual) is used to compare when the
+    expected answer is a namedtuple defined in the test file."""
+    msg = PASS
+    if is_namedtuple(actual, False):
+        msg = "expected namedtuple but found %s" % (type(actual).__name__)
+        return msg
+    if type(expected).__name__ != type(actual).__name__:
+        return "expected namedtuple %s but found namedtuple %s" % (type(expected).__name__, type(actual).__name__)
+    expected_fields = expected._fields
+    actual_fields = actual._fields
+    msg = list_compare_ordered(list(expected_fields), list(actual_fields), "namedtuple attributes")
+    if msg != PASS:
+        return msg
+    for field in expected_fields:
+        val = intelligent_compare(getattr(expected, field), getattr(actual, field))
+        if val != PASS:
+            msg = "at attribute %s of namedtuple %s, " % (field, type(expected).__name__) + val
+            return msg
+    return msg
+
+
+def clean_slashes(item):
+    """clean_slashes()"""
+    if isinstance(item, str):
+        return item.replace("\\", "/").replace("/", os.path.sep)
+    elif item == None or isinstance(item, (type, int, float, bool)):
+        return item
+    elif isinstance(item, (list, tuple, set)) or is_namedtuple(item):
+        new_item = []
+        for subitem in item:
+            new_item.append(clean_slashes(subitem))
+        if is_namedtuple(item):
+            return type(item)(*new_item)
+        else:
+            return type(item)(new_item)
+    elif isinstance(item, dict):
+        new_item = {}
+        for key in item:
+            new_item[clean_slashes(key)] = clean_slashes(item[key])
+        return new_item
+
+
+def list_compare_special_initialize(special_expected):
+    """list_compare_special_initialize(special_expected) takes in the special
+    ordering stored as a sorted list of items, and returns a list of lists
+    where the ordering among the inner lists does not matter."""
+    latest_val = None
+    clean_special = []
+    for row in special_expected:
+        if latest_val == None or row[1] != latest_val:
+            clean_special.append([])
+            latest_val = row[1]
+        clean_special[-1].append(row[0])
+    return clean_special
+
+
+def list_compare_special(special_expected, actual):
+    """list_compare_special(special_expected, actual) is used to compare when the
+    expected answer is a list with special ordering defined in `special_expected`."""
+    msg = PASS
+    expected_list = []
+    special_order = list_compare_special_initialize(special_expected)
+    for expected_item in special_order:
+        expected_list.extend(expected_item)
+    val = list_compare_unordered(expected_list, actual)
+    if val != PASS:
+        return val
+    i = 0
+    for expected_item in special_order:
+        j = len(expected_item)
+        actual_item = actual[i: i + j]
+        val = list_compare_unordered(expected_item, actual_item)
+        if val != PASS:
+            if j == 1:
+                msg = "at index %d " % (i) + val
+            else:
+                msg = "between indices %d and %d " % (i, i + j - 1) + val
+            msg = msg + " (list may not be ordered as required)"
+            break
+        i += j
+    return msg
+
+
+def dict_compare(expected, actual, obj=None):
+    """dict_compare(expected, actual) is used to compare when the expected answer
+    is a dict."""
+    msg = PASS
+    if not isinstance(actual, type(expected)):
+        msg = "expected to find type %s but found type %s" % (type(expected).__name__, type(actual).__name__)
+        return msg
+    if obj == None:
+        obj = type(expected).__name__
+
+    expected_keys = list(expected.keys())
+    actual_keys = list(actual.keys())
+    val = list_compare_unordered(expected_keys, actual_keys, obj)
+
+    if val != PASS:
+        msg = "bad keys in %s: " % (obj) + val
+    if msg == PASS:
+        for key in expected:
+            new_obj = None
+            if isinstance(expected[key], (list, tuple, set)):
+                new_obj = 'value'
+            elif isinstance(expected[key], dict):
+                new_obj = 'sub' + obj
+            val = intelligent_compare(expected[key], actual[key], new_obj)
+            if val != PASS:
+                msg = "incorrect value for key %s in %s: " % (repr(key), obj) + val
+    return msg
+
+
+def is_flippable(item):
+    """is_flippable(item) determines if the given dict of lists has lists of the
+    same length and is therefore flippable."""
+    item_lens = set(([str(len(item[key])) for key in item]))
+    if len(item_lens) == 1:
+        return PASS
+    else:
+        return "found lists of lengths %s" % (", ".join(list(item_lens)))
+
+def flip_dict_of_lists(item):
+    """flip_dict_of_lists(item) flips a dict of lists into a list of dicts if the
+    lists are of same length."""
+    new_item = []
+    length = len(list(item.values())[0])
+    for i in range(length):
+        new_dict = {}
+        for key in item:
+            new_dict[key] = item[key][i]
+        new_item.append(new_dict)
+    return new_item
+
+def compare_flip_dicts(expected, actual, obj="lists"):
+    """compare_flip_dicts(expected, actual) flips a dict of lists (or dicts) into
+    a list of dicts (or dict of dicts) and then compares the list ignoring order."""
+    msg = PASS
+    example_item = list(expected.values())[0]
+    if isinstance(example_item, (list, tuple)):
+        val = is_flippable(actual)
+        if val != PASS:
+            msg = "expected to find lists of length %d, but " % (len(example_item)) + val
+            return msg
+        msg = list_compare_unordered(flip_dict_of_lists(expected), flip_dict_of_lists(actual), "lists")
+    elif isinstance(example_item, dict):
+        expected_keys = list(example_item.keys())
+        for key in actual:
+            val = list_compare_unordered(expected_keys, list(actual[key].keys()), "dictionary %s" % key)
+            if val != PASS:
+                return val
+        for cat_key in expected_keys:
+            expected_category = {}
+            actual_category = {}
+            for key in expected:
+                expected_category[key] = expected[key][cat_key]
+                actual_category[key] = actual[key][cat_key]
+            val = list_compare_unordered(flip_dict_of_lists(expected), flip_dict_of_lists(actual), "category " + repr(cat_key))
+            if val != PASS:
+                return val
+    return msg
+
+
+def get_expected_tables():
+    """get_expected_tables() reads the html file with the expected DataFrames
+    and returns a dict mapping each question to a html table."""
+    if not os.path.exists(DF_FILE):
+        return None
+
+    expected_tables = {}
+    f = open(DF_FILE, encoding='utf-8')
+    soup = BeautifulSoup(f.read(), 'html.parser')
+    f.close()
+
+    tables = soup.find_all('table')
+    for table in tables:
+        expected_tables[table.get("data-question")] = table
+
+    return expected_tables
+
+def parse_df_html_table(table):
+    """parse_df_html_table(table) takes in a table as a html string and returns
+    a dict mapping each row and column index to the value at that position."""
+    rows = []
+    for tr in table.find_all('tr'):
+        rows.append([])
+        for cell in tr.find_all(['td', 'th']):
+            rows[-1].append(cell.get_text().strip("\n "))
+
+    cells = {}
+    for r in range(1, len(rows)):
+        for c in range(1, len(rows[0])):
+            rname = rows[r][0]
+            cname = rows[0][c]
+            cells[(rname,cname)] = rows[r][c]
+    return cells
+
+
+def get_expected_namedtuples():
+    """get_expected_namedtuples() defines the required namedtuple objects
+    globally. It also returns a tuple of the classes."""
+    expected_namedtuples = []
+    
+    return tuple(expected_namedtuples)
+
+_expected_namedtuples = get_expected_namedtuples()
+
+
+def compare_cell_html(expected, actual):
+    """compare_cell_html(expected, actual) is used to compare when the
+    expected answer is a DataFrame stored in the `expected_dfs` html file."""
+    expected_cells = parse_df_html_table(expected)
+    try:
+        actual_cells = parse_df_html_table(BeautifulSoup(actual, 'html.parser').find('table'))
+    except Exception as e:
+        return "expected to find type DataFrame but found type %s instead" % type(actual).__name__
+
+    expected_cols = list(set(["column %s" % (loc[1]) for loc in expected_cells]))
+    actual_cols = list(set(["column %s" % (loc[1]) for loc in actual_cells]))
+    msg = list_compare_unordered(expected_cols, actual_cols, "DataFrame")
+    if msg != PASS:
+        return msg
+
+    expected_rows = list(set(["row index %s" % (loc[0]) for loc in expected_cells]))
+    actual_rows = list(set(["row index %s" % (loc[0]) for loc in actual_cells]))
+    msg = list_compare_unordered(expected_rows, actual_rows, "DataFrame")
+    if msg != PASS:
+        return msg
+
+    for location, expected in expected_cells.items():
+        location_name = "column {} at index {}".format(location[1], location[0])
+        actual = actual_cells.get(location, None)
+        if actual is None:
+            return "in %s, expected to find %s" % (location_name, repr(expected))
+        try:
+            actual_ans = float(actual)
+            expected_ans = float(expected)
+            if math.isnan(actual_ans) and math.isnan(expected_ans):
+                continue
+        except Exception as e:
+            actual_ans, expected_ans = actual, expected
+        msg = simple_compare(expected_ans, actual_ans)
+        if msg != PASS:
+            return "in %s, " % location_name + msg
+    return PASS
+
+
+def get_expected_plots():
+    """get_expected_plots() reads the json file with the expected plot data
+    and returns a dict mapping each question to a dictionary with the plots data."""
+    if not os.path.exists(PLOT_FILE):
+        return None
+
+    f = open(PLOT_FILE, encoding='utf-8')
+    expected_plots = json.load(f)
+    f.close()
+    return expected_plots
+
+
+def compare_file_json(expected, actual):
+    """compare_file_json(expected, actual) is used to compare when the
+    expected answer is a JSON file."""
+    msg = PASS
+    if not os.path.isfile(expected):
+        return "file %s not found; make sure it is downloaded and stored in the correct directory" % (expected)
+    elif not os.path.isfile(actual):
+        return "file %s not found; make sure that you have created the file with the correct name" % (actual)
+    try:
+        e = open(expected, encoding='utf-8')
+        expected_data = json.load(e)
+        e.close()
+    except json.JSONDecodeError:
+        return "file %s is broken and cannot be parsed; please delete and redownload the file correctly" % (expected)
+    try:
+        a = open(actual, encoding='utf-8')
+        actual_data = json.load(a)
+        a.close()
+    except json.JSONDecodeError:
+        return "file %s is broken and cannot be parsed" % (actual)
+    if isinstance(expected_data, list):
+        msg = list_compare_ordered(expected_data, actual_data, 'file ' + actual)
+    elif isinstance(expected_data, dict):
+        msg = dict_compare(expected_data, actual_data)
+    return msg
+
+
+_expected_json = get_expected_json()
+_special_json = get_special_json()
+_expected_plots = get_expected_plots()
+_expected_tables = get_expected_tables()
+_expected_format = get_expected_format()
+
+def check(qnum, actual):
+    """check(qnum, actual) is used to check if the answer in the notebook is
+    the correct answer, and provide useful feedback if the answer is incorrect."""
+    msg = PASS
+    error_msg = "<b style='color: red;'>ERROR:</b> "
+    q_format = _expected_format[qnum]
+
+    if q_format == TEXT_FORMAT_SPECIAL_ORDERED_LIST:
+        expected = _special_json[qnum]
+    elif q_format == PNG_FORMAT_SCATTER:
+        if _expected_plots is None:
+            msg = error_msg + "file %s not parsed; make sure it is downloaded and stored in the correct directory" % (PLOT_FILE)
+        else:
+            expected = _expected_plots[qnum]
+    elif q_format == HTML_FORMAT:
+        if _expected_tables is None:
+            msg = error_msg + "file %s not parsed; make sure it is downloaded and stored in the correct directory" % (DF_FILE)
+        else:
+            expected = _expected_tables[qnum]
+    else:
+        expected = _expected_json[qnum]
+
+    if SLASHES in q_format:
+        q_format = q_format.replace(SLASHES, "")
+        expected = clean_slashes(expected)
+        actual = clean_slashes(actual)
+
+    if msg != PASS:
+        print(msg)
+    else:
+        msg = compare(expected, actual, q_format)
+        if msg != PASS:
+            msg = error_msg + msg
+        print(msg)
+
+
+def check_file_size(path):
+    """check_file_size(path) throws an error if the file is too big to display
+    on Gradescope."""
+    size = os.path.getsize(path)
+    assert size < MAX_FILE_SIZE * 10**3, "Your file is too big to be displayed by Gradescope; please delete unnecessary output cells so your file size is < %s KB" % MAX_FILE_SIZE
+
+
+def reset_hidden_tests():
+    """reset_hidden_tests() resets all hidden tests on the Gradescope autograder where the hidden test file exists"""
+    if not os.path.exists(HIDDEN_FILE):
+        return
+    hidn.reset_hidden_tests()
+
+def rubric_check(rubric_point, ignore_past_errors=True):
+    """rubric_check(rubric_point) uses the hidden test file on the Gradescope autograder to grade the `rubric_point`"""
+    if not os.path.exists(HIDDEN_FILE):
+        print(PASS)
+        return
+    error_msg_1 = "ERROR: "
+    error_msg_2 = "TEST DETAILS: "
+    try:
+        msg = hidn.rubric_check(rubric_point, ignore_past_errors)
+    except:
+        msg = "hidden tests crashed before execution"
+    if msg != PASS:
+        hidn.make_deductions(rubric_point)
+        if msg == "public tests failed":
+            comment = "The public tests have failed, so you will not receive any points for this question."
+            comment += "\nPlease confirm that the public tests pass locally before submitting."
+        elif msg == "answer is hardcoded":
+            comment = "In the datasets for testing hardcoding, all numbers are replaced with random values."
+            comment += "\nIf the answer is the same as in the original dataset for all these datasets"
+            comment += "\ndespite this, that implies that the answer in the notebook is hardcoded."
+            comment += "\nYou will not receive any points for this question."
+        else:
+            comment = hidn.get_comment(rubric_point)
+        msg = error_msg_1 + msg
+        if comment != "":
+            msg = msg + "\n" + error_msg_2 + comment
+    print(msg)
+
+def get_summary():
+    """get_summary() returns the summary of the notebook using the hidden test file on the Gradescope autograder"""
+    if not os.path.exists(HIDDEN_FILE):
+        print("Total Score: %d/%d" % (TOTAL_SCORE, TOTAL_SCORE))
+        return
+    score = min(TOTAL_SCORE, hidn.get_score(TOTAL_SCORE))
+    display_msg = "Total Score: %d/%d" % (score, TOTAL_SCORE)
+    if score != TOTAL_SCORE:
+        display_msg += "\n" + hidn.get_deduction_string()
+    print(display_msg)
+
+def get_score_digit(digit):
+    """get_score_digit(digit) returns the `digit` of the score using the hidden test file on the Gradescope autograder"""
+    if not os.path.exists(HIDDEN_FILE):
+        score = TOTAL_SCORE
+    else:
+        score = hidn.get_score(TOTAL_SCORE)
+    digits = bin(score)[2:]
+    digits = "0"*(7 - len(digits)) + digits
+    return int(digits[6 - digit])
diff --git a/p5/README.md b/p5/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..8cf8c2f989e964ffdbad0df93e4e335e439c02fe
--- /dev/null
+++ b/p5/README.md
@@ -0,0 +1,48 @@
+# Project 5 (P5): Investigating Hurricane Data
+
+
+## Corrections and clarifications
+
+* None yet.
+
+**Find any issues?** Report to us:
+
+- Takis Chytas <chytas@wisc.edu>
+- Samuel Guo <sguo258@wisc.edu>
+
+## Note on Academic Misconduct:
+You are **allowed** to work with a partner on your projects. While it is not required that you work with a partner, it is **recommended** that you find a project partner as soon as possible as the projects will get progressively harder. Be careful **not** to work with more than one partner. If you worked with a partner on Lab-P5, you are **not** allowed to finish your project with a different partner. You may either continue to work with the same partner, or work on P5 alone. Now may be a good time to review our [course policies](https://cs220.cs.wisc.edu/f23/syllabus.html).
+
+## Instructions:
+
+This project will focus on **loops** and **strings**. To start, download `p5.ipynb`, `project.py`, `public_tests.py`, and `hurricanes.csv`.
+
+**Note:** Please go through [Lab-P5](https://git.doit.wisc.edu/cdis/cs/courses/cs220/cs220-f23-projects/-/tree/main/lab-p5) before you start the project. The lab contains some very important information that will be necessary for you to finish the project.
+
+You will work on `p5.ipynb` and hand it in. You should follow the provided directions for each question. Questions have **specific** directions on what **to do** and what **not to do**.
+
+After you've downloaded the files to your `p5` directory, open a terminal window and use `cd` to navigate to that directory. To make sure you're in the correct directory, type `pwd`. To make sure you've downloaded all the required files, type `ls` and confirm that `p5.ipynb`, `project.py`, `public_tests.py`, and `hurricanes.csv` are listed. Then run the command `jupyter notebook` to start Jupyter and get started on the project!
+
+**IMPORTANT**: You should **NOT** terminate/close the session where you run the above command. If you need to use any other Terminal/PowerShell commands, open a new window instead. Save your notebook file frequently, either by clicking the "Save and Checkpoint" button (the floppy disk icon) or by using the appropriate keyboard shortcut.
+
+------------------------------
+
+## IMPORTANT Submission instructions:
+- Review the [Grading Rubric](https://git.doit.wisc.edu/cdis/cs/courses/cs220/cs220-f23-projects/-/tree/main/p5/rubric.md) to ensure that you don't lose points during code review.
+- Login to [Gradescope](https://www.gradescope.com/) and upload the zip file into the P5 assignment.
+- If you completed the project with a **partner**, make sure to **add their name** by clicking "Add Group Member"
+in Gradescope when uploading the P5 zip file.
+
+   <img src="images/add_group_member.png" width="400">
+
+   **Warning:** You will have to add your partner on Gradescope even if you have filled out this information in your `p5.ipynb` notebook.
+
+- It is **your responsibility** to make sure that your project clears the auto-grader tests on the Gradescope test system. Otter test results should be available within forty minutes of your submission (usually within ten minutes). **Ignore** the `-/100.00` that is displayed to the right. You should be able to see the PASS / FAIL results for the 20 test cases, which are accessible via the Gradescope Dashboard (as in the image below):
+
+    <img src="images/gradescope.png" width="400">
+
+- You can view your **final score** at the **end of the page**. If you pass all tests, then you will receive **full points** for the project. Otherwise, you can see your final score in the **summary** section of the test results (as in the image below):
+
+   <img src="images/summary.png" width="400">
+
+   If you want to know why you lost points on a particular test, you can scroll up to find more details about that test.
diff --git a/p5/gen_csv.ipynb b/p5/gen_csv.ipynb
new file mode 100644
index 0000000000000000000000000000000000000000..20a934d9551ed792fad94838d4f37ee214a22508
--- /dev/null
+++ b/p5/gen_csv.ipynb
@@ -0,0 +1,1180 @@
+{
+ "cells": [
+  {
+   "cell_type": "code",
+   "execution_count": 1,
+   "id": "0093917e",
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "import pandas as pd\n",
+    "import requests\n",
+    "from bs4 import BeautifulSoup as BS\n",
+    "import csv\n",
+    "import datetime"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 2,
+   "id": "23fa7e95",
+   "metadata": {},
+   "outputs": [
+    {
+     "data": {
+      "text/plain": [
+       "2023"
+      ]
+     },
+     "execution_count": 2,
+     "metadata": {},
+     "output_type": "execute_result"
+    }
+   ],
+   "source": [
+    "FINAL_YEAR_IN_DATASET = int(datetime.datetime.now().date().strftime(\"%Y\"))\n",
+    "\n",
+    "FINAL_YEAR_IN_DATASET"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 3,
+   "id": "83d6a097",
+   "metadata": {},
+   "outputs": [
+    {
+     "data": {
+      "text/plain": [
+       "179"
+      ]
+     },
+     "execution_count": 3,
+     "metadata": {},
+     "output_type": "execute_result"
+    }
+   ],
+   "source": [
+    "year_urls = {}\n",
+    "for year in range(1800, 1850, 10):\n",
+    "    year_urls[year] = \"https://en.wikipedia.org/wiki/%ss_Atlantic_hurricane_seasons\" % (str(year))\n",
+    "for year in range(1850, FINAL_YEAR_IN_DATASET+1):\n",
+    "    year_urls[year] = \"https://en.wikipedia.org/wiki/%s_Atlantic_hurricane_season\" % (str(year))\n",
+    "    \n",
+    "len(year_urls)"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 4,
+   "id": "810e4019",
+   "metadata": {},
+   "outputs": [
+    {
+     "data": {
+      "text/plain": [
+       "179"
+      ]
+     },
+     "execution_count": 4,
+     "metadata": {},
+     "output_type": "execute_result"
+    }
+   ],
+   "source": [
+    "year_pages = {}\n",
+    "for year in year_urls:\n",
+    "    r = requests.get(year_urls[year])\n",
+    "    page = BS(r.text, \"html.parser\")\n",
+    "    year_pages[year] = page\n",
+    "len(year_pages)"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 5,
+   "id": "db752dde",
+   "metadata": {},
+   "outputs": [
+    {
+     "data": {
+      "text/plain": [
+       "627"
+      ]
+     },
+     "execution_count": 5,
+     "metadata": {},
+     "output_type": "execute_result"
+    }
+   ],
+   "source": [
+    "hurricane_urls = []\n",
+    "for year in year_pages:\n",
+    "    page = year_pages[year]\n",
+    "    for url in page.find_all(\"div\", {\"class\": \"hatnote navigation-not-searchable\"}):\n",
+    "        if 'main article' in url.get_text().lower():\n",
+    "            hurr_url = \"https://en.wikipedia.org\" + url.find('a')['href']\n",
+    "            if hurr_url not in hurricane_urls:\n",
+    "                hurricane_urls.append(hurr_url)\n",
+    "                \n",
+    "len(hurricane_urls)"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 6,
+   "id": "13e67b9c",
+   "metadata": {},
+   "outputs": [
+    {
+     "data": {
+      "text/plain": [
+       "['https://en.wikipedia.org/wiki/1804_Antigua%E2%80%93Charleston_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1804_Snow_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1806_Great_Coastal_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1812_Louisiana_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1815_North_Carolina_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/Great_September_Gale_of_1815',\n",
+       " 'https://en.wikipedia.org/wiki/1821_Norfolk_and_Long_Island_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1826_Canary_Islands_storm',\n",
+       " 'https://en.wikipedia.org/wiki/1827_North_Carolina_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/Great_Barbados_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/Racer%27s_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1842_Atlantic_hurricane_season',\n",
+       " 'https://en.wikipedia.org/wiki/Great_Havana_Hurricane_of_1846',\n",
+       " 'https://en.wikipedia.org/wiki/1848_Tampa_Bay_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1856_Last_Island_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1867_San_Narciso_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1869_Saxby_Gale',\n",
+       " 'https://en.wikipedia.org/wiki/1875_Indianola_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1876_San_Felipe_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/Gale_of_1878',\n",
+       " 'https://en.wikipedia.org/wiki/1886_Indianola_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1887_Halloween_tropical_storm',\n",
+       " 'https://en.wikipedia.org/wiki/1888_Louisiana_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1891_Martinique_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1893_San_Roque_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1893_New_York_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1893_Sea_Islands_Hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1893_Cheniere_Caminada_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1894_Greater_Antilles_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1896_Cedar_Keys_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1896_East_Coast_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1898_Windward_Islands_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1898_Georgia_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1899_Carrabelle_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1899_San_Ciriaco_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1900_Galveston_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1901_Louisiana_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1903_Jamaica_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1903_Florida_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1903_New_Jersey_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1906_Mississippi_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1906_Florida_Keys_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1909_Velasco_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1909_Monterrey_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1909_Grand_Isle_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1909_Florida_Keys_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1909_Greater_Antilles_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1910_Cuba_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1912_Jamaica_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1915_Galveston_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1915_New_Orleans_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1916_Gulf_Coast_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1916_Charleston_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1916_Texas_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1916_Virgin_Islands_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1916_Pensacola_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1917_Nueva_Gerona_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1919_Florida_Keys_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1920_Louisiana_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/September_1921_San_Antonio_floods',\n",
+       " 'https://en.wikipedia.org/wiki/1921_Tampa_Bay_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1924_Cuba_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1925_Florida_tropical_storm',\n",
+       " 'https://en.wikipedia.org/wiki/1926_Nassau_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1926_Nova_Scotia_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1926_Louisiana_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1926_Miami_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1926_Havana%E2%80%93Bermuda_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1927_Nova_Scotia_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1928_Fort_Pierce_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1928_Haiti_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1928_Okeechobee_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1929_Bahamas_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1930_Dominican_Republic_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1931_British_Honduras_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1932_Freeport_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1932_Florida%E2%80%93Alabama_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1932_Bahamas_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1932_San_Cipri%C3%A1n_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1932_Cuba_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1933_Trinidad_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1933_Texas_tropical_storm',\n",
+       " 'https://en.wikipedia.org/wiki/1933_Florida%E2%80%93Mexico_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1933_Chesapeake%E2%80%93Potomac_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1933_Cuba%E2%80%93Brownsville_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1933_Treasure_Coast_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1933_Outer_Banks_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1933_Tampico_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1933_Cuba%E2%80%93Bahamas_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1934_Central_America_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1935_Labor_Day_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1935_Cuba_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1935_J%C3%A9r%C3%A9mie_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1935_Yankee_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1936_Mid-Atlantic_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1938_New_England_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1940_Louisiana_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1940_South_Carolina_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1940_New_England_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1940_Nova_Scotia_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1941_Texas_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1941_Florida_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1942_Matagorda_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1942_Belize_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1943_Surprise_Hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1944_Jamaica_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1944_Great_Atlantic_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1944_Cuba%E2%80%93Florida_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1945_Outer_Banks_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1945_Texas_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1945_Homestead_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1946_Florida_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1947_Fort_Lauderdale_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1947_Cape_Sable_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1948_Bermuda%E2%80%93Newfoundland_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/September_1948_Florida_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1948_Miami_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1949_Florida_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1949_Texas_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Able_(1950)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Baker_(1950)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Dog_(1950)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Easy_(1950)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_King',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_cyclone_naming',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Able_(1951)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Charlie_(1951)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_How',\n",
+       " 'https://en.wikipedia.org/wiki/1952_Groundhog_Day_tropical_storm',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Able_(1952)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Fox_(1952)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Alice_(1953)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Barbara_(1953)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Carol_(1953)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Florence_(1953)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Alice_(June_1954)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Carol',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Edna',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Hazel',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Alice_(December_1954)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Connie',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Diane',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Ione',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Hilda_(1955)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Janet',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Betsy_(1956)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Flossy_(1956)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Greta_(1956)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Audrey',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Carrie',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Ella_(1958)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Helene_(1958)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Arlene_(1959)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Cindy_(1959)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Debra_(1959)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Gracie',\n",
+       " 'https://en.wikipedia.org/wiki/1960_Texas_tropical_storm',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Abby_(1960)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Brenda_(1960)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Donna',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Ethel_(1960)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Anna_(1961)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Carla',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Debbie_(1961)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Esther',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Hattie',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Alma_(1962)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Daisy_(1962)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Arlene_(1963)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Cindy_(1963)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Edith_(1963)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Flora',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Ginny',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Abby_(1964)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Cleo',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Dora',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Gladys_(1964)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Hilda',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Isbell',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Betsy',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Debbie_(1965)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Alma_(1966)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Faith',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Inez',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Beulah',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Doria',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Abby_(1968)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Candy',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Gladys_(1968)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Camille',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Debbie_(1969)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Francelia',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Gerda',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Inga',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Martha',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Alma_(1970)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Becky_(1970)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Celia',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Dorothy_(1970)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Ella_(1970)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Felice',\n",
+       " 'https://en.wikipedia.org/wiki/1970_Caribbean%E2%80%93Azores_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/1970_Canada_hurricane',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Beth',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Doria',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Fern',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Edith_(1971)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Ginger',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Irene%E2%80%93Olivia',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Laura_(1971)',\n",
+       " 'https://en.wikipedia.org/wiki/Subtropical_Storm_Alpha_(1972)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Agnes',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Carrie_(1972)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Brenda_(1973)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Christine_(1973)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Delia_(1973)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Fran_(1973)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Gilda_(1973)',\n",
+       " 'https://en.wikipedia.org/wiki/Subtropical_Storm_One_(1974)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Alma_(1974)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Carmen',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Fifi%E2%80%93Orlene',\n",
+       " 'https://en.wikipedia.org/wiki/Subtropical_Storm_Four_(1974)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Amy_(1975)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Depression_Six_(1975)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Caroline',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Eloise',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Gladys_(1975)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Belle',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Dottie',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Emmy',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Anita',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Babe',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Amelia_(1978)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Cora',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Debra_(1978)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Ella_(1978)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Greta%E2%80%93Olivia',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Depression_One_(1979)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Bob_(1979)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Claudette_(1979)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_David',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Frederic',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Elena_(1979)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Henri_(1979)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Allen',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Danielle_(1980)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Hermine_(1980)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Jeanne_(1980)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Karl_(1980)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Arlene_(1981)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Bret_(1981)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Dennis_(1981)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Depression_Eight_(1981)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Katrina_(1981)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Alberto_(1982)',\n",
+       " 'https://en.wikipedia.org/wiki/1982_Florida_subtropical_storm',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Beryl_(1982)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Chris_(1982)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Debby_(1982)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Alicia',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Barry_(1983)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Dean_(1983)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Diana_(1984)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Fran_(1984)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Isidore_(1984)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Josephine_(1984)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Klaus_(1984)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Lili_(1984)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Bob_(1985)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Danny_(1985)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Elena',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Gloria',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Juan_(1985)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Kate_(1985)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Bonnie_(1986)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Charley_(1986)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Danielle_(1986)',\n",
+       " 'https://en.wikipedia.org/wiki/1987_Gulf_Coast_tropical_storm',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Arlene_(1987)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Emily_(1987)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Floyd_(1987)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Depression_Fourteen_(1987)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Depression_One_(1988)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Beryl_(1988)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Chris_(1988)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Debby_(1988)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Florence_(1988)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Gilbert',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Joan%E2%80%93Miriam',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Keith_(1988)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Allison_(1989)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Chantal_(1989)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Dean_(1989)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Gabrielle_(1989)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Hugo',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Jerry_(1989)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Bertha_(1990)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Diana',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Gustav_(1990)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Klaus_(1990)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Lili_(1990)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Marco_(1990)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Bob',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Grace_(1991)',\n",
+       " 'https://en.wikipedia.org/wiki/1991_Perfect_Storm',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Depression_One_(1992)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Andrew',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Bonnie_(1992)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Danielle_(1992)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Depression_One_(1993)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Arlene_(1993)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Bret_(1993)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Cindy_(1993)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Emily_(1993)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Gert',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Alberto_(1994)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Beryl_(1994)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Debby_(1994)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Florence_(1994)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Gordon',\n",
+       " 'https://en.wikipedia.org/wiki/Christmas_1994_nor%27easter',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Allison_(1995)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Dean_(1995)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Erin_(1995)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Felix_(1995)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Gabrielle_(1995)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Iris_(1995)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Jerry_(1995)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Luis',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Marilyn',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Opal',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Roxanne',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Tanya',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Arthur_(1996)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Bertha_(1996)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Cesar%E2%80%93Douglas',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Dolly_(1996)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Edouard_(1996)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Fran',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Hortense',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Josephine_(1996)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Lili_(1996)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Marco_(1996)',\n",
+       " 'https://en.wikipedia.org/wiki/1996_Lake_Huron_cyclone',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Danny_(1997)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Erika_(1997)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Bonnie_(1998)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Charley_(1998)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Danielle_(1998)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Earl_(1998)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Frances_(1998)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Georges',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Hermine_(1998)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Mitch',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Bret',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Dennis_(1999)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Floyd',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Gert_(1999)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Harvey_(1999)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Irene_(1999)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Jose_(1999)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Katrina_(1999)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Lenny',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Alberto_(2000)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Beryl_(2000)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Debby_(2000)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Florence_(2000)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Gordon_(2000)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Helene_(2000)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Isaac_(2000)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Joyce_(2000)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Keith',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Leslie_(2000)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Michael_(2000)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Allison',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Barry_(2001)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Chantal_(2001)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Dean_(2001)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Erin_(2001)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Gabrielle_(2001)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Humberto_(2001)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Iris',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Jerry_(2001)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Karen_(2001)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Michelle',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Olga',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Arthur_(2002)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Bertha_(2002)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Cristobal_(2002)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Edouard_(2002)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Fay_(2002)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Gustav_(2002)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Hanna_(2002)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Isidore',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Kyle_(2002)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Lili',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Ana_(2003)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Bill_(2003)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Claudette_(2003)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Erika_(2003)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Fabian',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Grace_(2003)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Henri_(2003)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Isabel',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Juan',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Kate_(2003)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Larry_(2003)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Nicholas_(2003)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Odette_(2003)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Alex_(2004)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Bonnie_(2004)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Charley',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Earl_(2004)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Frances',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Gaston_(2004)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Ivan',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Jeanne',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Karl_(2004)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Matthew_(2004)',\n",
+       " 'https://en.wikipedia.org/wiki/Subtropical_Storm_Nicole',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Arlene_(2005)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Bret_(2005)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Cindy_(2005)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Dennis',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Emily_(2005)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Gert_(2005)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Irene_(2005)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Jose_(2005)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Katrina',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Maria_(2005)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Nate_(2005)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Ophelia_(2005)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Rita',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Stan',\n",
+       " 'https://en.wikipedia.org/wiki/2005_Azores_subtropical_storm',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Tammy',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Vince',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Wilma',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Alpha_(2005)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Beta',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Gamma_(2005)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Delta_(2005)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Epsilon_(2005)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Zeta_(2005)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Alberto_(2006)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Beryl_(2006)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Chris_(2006)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Debby_(2006)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Ernesto_(2006)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Florence_(2006)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Gordon_(2006)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Helene_(2006)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Isaac_(2006)',\n",
+       " 'https://en.wikipedia.org/wiki/Subtropical_Storm_Andrea_(2007)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Barry_(2007)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Chantal_(2007)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Dean',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Erin_(2007)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Felix',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Gabrielle_(2007)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Humberto_(2007)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Depression_Ten_(2007)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Lorenzo_(2007)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Noel',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Olga_(2007)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Arthur_(2008)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Bertha_(2008)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Cristobal_(2008)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Dolly_(2008)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Edouard_(2008)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Fay_(2008)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Gustav',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Hanna_(2008)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Ike',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Kyle_(2008)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Laura_(2008)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Marco_(2008)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Omar',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Paloma',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Ana_(2009)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Bill_(2009)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Claudette_(2009)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Danny_(2009)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Erika_(2009)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Fred_(2009)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Grace_(2009)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Ida_(2009)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Alex_(2010)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Depression_Two_(2010)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Bonnie_(2010)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Colin_(2010)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Depression_Five_(2010)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Earl_(2010)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Hermine_(2010)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Igor',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Julia_(2010)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Karl',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Matthew_(2010)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Nicole_(2010)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Otto_(2010)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Paula',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Richard',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Shary',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Tomas',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Arlene_(2011)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Bret_(2011)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Don_(2011)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Emily_(2011)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Harvey_(2011)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Irene',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Katia_(2011)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Lee_(2011)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Maria_(2011)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Nate_(2011)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Ophelia_(2011)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Rina',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Beryl_(2012)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Debby_(2012)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Ernesto_(2012)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Helene_(2012)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Isaac_(2012)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Leslie_(2012)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Nadine',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Rafael',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Sandy',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Andrea_(2013)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Barry_(2013)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Chantal_(2013)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Fernand_(2013)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Ingrid',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Karen_(2013)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Arthur',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Bertha_(2014)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Cristobal',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Dolly_(2014)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Fay',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Gonzalo',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Ana_(2015)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Bill_(2015)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Danny_(2015)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Erika',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Fred_(2015)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Joaquin',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Kate_(2015)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Alex_(2016)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Bonnie_(2016)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Colin_(2016)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Earl_(2016)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Hermine',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Julia_(2016)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Matthew',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Nicole_(2016)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Otto',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Bret_(2017)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Cindy_(2017)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Emily_(2017)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Franklin_(2017)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Gert_(2017)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Harvey',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Irma',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Jose_(2017)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Katia_(2017)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Maria',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Nate',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Ophelia_(2017)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Philippe_(2017)',\n",
+       " 'https://en.wikipedia.org/wiki/Potential_Tropical_Cyclone_Ten',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Alberto_(2018)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Beryl',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Chris_(2018)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Florence',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Gordon_(2018)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Kirk_(2018)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Leslie_(2018)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Michael',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Barry_(2019)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Dorian',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Fernand_(2019)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Humberto_(2019)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Imelda',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Karen_(2019)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Lorenzo_(2019)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Melissa_(2019)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Nestor_(2019)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Pablo',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Olga_(2019)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Arthur_(2020)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Bertha_(2020)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_storms_Amanda_and_Cristobal',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Fay_(2020)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Hanna_(2020)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Isaias',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Laura',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Marco_(2020)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Nana_(2020)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Paulette',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Sally',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Teddy',\n",
+       " 'https://en.wikipedia.org/wiki/Subtropical_Storm_Alpha_(2020)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Beta_(2020)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Gamma',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Delta',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Epsilon_(2020)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Zeta',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Eta',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Iota',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Claudette_(2021)',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Danny_(2021)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Elsa',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Fred_(2021)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Grace',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Henri',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Ida',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Larry',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Mindy',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Nicholas',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Sam',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Alex_(2022)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Bonnie_(2022)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Danielle_(2022)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Earl_(2022)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Fiona',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Ian',\n",
+       " 'https://en.wikipedia.org/wiki/Tropical_Storm_Hermine_(2022)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Julia_(2022)',\n",
+       " 'https://en.wikipedia.org/wiki/Hurricane_Nicole_(2022)']"
+      ]
+     },
+     "execution_count": 6,
+     "metadata": {},
+     "output_type": "execute_result"
+    }
+   ],
+   "source": [
+    "hurricane_urls"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 7,
+   "id": "ee7a9025",
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "pages = {}\n",
+    "for url in hurricane_urls:\n",
+    "    r = requests.get(url)\n",
+    "    pages[url] = BS(r.text, \"html.parser\")"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 8,
+   "id": "119d1259",
+   "metadata": {},
+   "outputs": [
+    {
+     "data": {
+      "text/plain": [
+       "627"
+      ]
+     },
+     "execution_count": 8,
+     "metadata": {},
+     "output_type": "execute_result"
+    }
+   ],
+   "source": [
+    "hurr_data = {}\n",
+    "for url in pages:\n",
+    "    page = pages[url]\n",
+    "    title_full = page.find('title').get_text()\n",
+    "    title = title_full.split(\"-\")[0].strip()\n",
+    "    table = page.find('table')\n",
+    "    fields = {}\n",
+    "    for tr in page.find_all(\"tr\"):\n",
+    "        tds = tr.find_all([\"td\", \"th\"])\n",
+    "        tds = [td.get_text().strip().lower() for td in tds]\n",
+    "        if len(tds) != 2:\n",
+    "            continue\n",
+    "        fields[tds[0].replace(\"\\xa0\", \" \")] = tds[1].strip().lower().replace(\",\", \"\").replace(\"\\xa0\", \" \")\n",
+    "    hurr_data[title] = fields\n",
+    "\n",
+    "len(hurr_data)"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 9,
+   "id": "013820db",
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "def clean_name(hurr, formed):\n",
+    "    name = hurr.replace('Tropical Storm', 'Hurricane').replace('Tropical storms', 'Hurricane').split('Hurricane')[-1].strip()\n",
+    "    if name == '':\n",
+    "        name = hurr\n",
+    "    name = name.split('(')[0].strip()\n",
+    "    for hurr_type in ['subtropical storm', 'tropical depression', 'tropical cyclone', 'potential tropical cyclone']:\n",
+    "        if name.lower().startswith(hurr_type):\n",
+    "            name = formed[-4:] + ' ' + name\n",
+    "    return name"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 10,
+   "id": "8d2f12be",
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "def clean_date(date):\n",
+    "    date = date.split('(')[0].replace('  ', ' ').strip()\n",
+    "    if len(date.split()) != 3:\n",
+    "        return None\n",
+    "    month_list = ['january', 'february', 'march', 'april', 'may', 'june', 'july', 'august', 'september', 'october', 'november', 'december']\n",
+    "    months = {}\n",
+    "    for month in range(len(month_list)):\n",
+    "        mm = str(month+1)\n",
+    "        if len(mm) < 2:\n",
+    "            mm = '0' + mm\n",
+    "        months[month_list[month]] = mm\n",
+    "    month, dd, yyyy = date.split()\n",
+    "    if dd in month_list:\n",
+    "        month, dd = dd, month\n",
+    "    mm = months[month]\n",
+    "    if len(dd) < 2:\n",
+    "            dd = '0' + dd\n",
+    "    return '%s/%s/%s' % (mm[:2], dd[:2], yyyy[:4])"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 11,
+   "id": "b5556480",
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "def clean_damage(damage):\n",
+    "    if damage in ['unknown', 'millions', 'moderate']:\n",
+    "        return None\n",
+    "    elif damage in ['minimal', 'none', 'none reported', 'unspecified']:\n",
+    "        return '0'\n",
+    "    damage = damage.split(\"$\")[-1].split('–')[-1].split(\" \")\n",
+    "    try:\n",
+    "        num = float(damage[0])\n",
+    "    except:\n",
+    "        return None\n",
+    "    if int(num) == num:\n",
+    "        num = int(num)\n",
+    "    num = str(num)\n",
+    "    if len(damage) > 1 and damage[1] == \"million\":\n",
+    "        final_damage = num + \"M\"\n",
+    "    elif len(damage) > 1 and damage[1] == \"billion\":\n",
+    "        final_damage = num + \"B\"\n",
+    "    elif str(damage[0])[-3:] == \"000\":\n",
+    "        final_damage = num[:-3] + \"K\"\n",
+    "    else:\n",
+    "        final_damage = num\n",
+    "    return final_damage"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 12,
+   "id": "62b7ff3f",
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "def clean_deaths(deaths):\n",
+    "    if 'no ' in deaths or 'none' in deaths:\n",
+    "        return '0'\n",
+    "    deaths = deaths.replace('≥', '').replace('at least', '').replace('up to', '').replace('over', '')\n",
+    "    deaths = deaths.replace('c.', '').replace('+', '').replace('>', '').replace('~', '')\n",
+    "    if '–' in deaths:\n",
+    "        deaths = deaths.split('–')[1]\n",
+    "    elif '-' in deaths:\n",
+    "        deaths = deaths.split('-')[1]\n",
+    "    deaths = deaths.replace('indirect', 'total').replace('direct', 'total').replace('all', 'total')\n",
+    "    deaths = deaths.replace('reported', 'total').replace('related', 'total').replace('confirmed', 'total')\n",
+    "    deaths = deaths.replace('deaths', 'total').replace('dead', 'total').replace('overall', 'total')\n",
+    "    deaths = deaths.split('[')[0].split()[0]\n",
+    "    deaths = deaths.split('total')[0].strip('( ')\n",
+    "    try:\n",
+    "        return str(int(deaths))\n",
+    "    except:\n",
+    "        return None"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 13,
+   "id": "a280140b",
+   "metadata": {},
+   "outputs": [
+    {
+     "data": {
+      "text/plain": [
+       "554"
+      ]
+     },
+     "execution_count": 13,
+     "metadata": {},
+     "output_type": "execute_result"
+    }
+   ],
+   "source": [
+    "full_data_hurr = []\n",
+    "for hurr in hurr_data:\n",
+    "    bad_hurr = False\n",
+    "    fields = list(hurr_data[hurr].keys())\n",
+    "    for key in ['formed', 'dissipated', 'highest winds', 'fatalities', 'damage']:\n",
+    "        if key not in fields:\n",
+    "            bad_hurr = True\n",
+    "    for key in fields:\n",
+    "        if key not in ['formed', 'dissipated', 'highest winds', 'fatalities', 'damage']:\n",
+    "            hurr_data[hurr].pop(key)\n",
+    "    if bad_hurr == False:\n",
+    "        formed = clean_date(hurr_data[hurr]['formed'])\n",
+    "        dissipated = clean_date(hurr_data[hurr]['dissipated'])\n",
+    "        mph = int(hurr_data[hurr]['highest winds'].split(\":\")[-1].split('mph')[0].split('(')[-1].strip())\n",
+    "        damage = clean_damage(hurr_data[hurr]['damage'])\n",
+    "        deaths = clean_deaths(hurr_data[hurr]['fatalities'])\n",
+    "        if formed != None and dissipated != None and damage != None and deaths != None:\n",
+    "            final_hurr_data = {}\n",
+    "            final_hurr_data['name'] = clean_name(hurr, formed)\n",
+    "            final_hurr_data['formed'] = formed\n",
+    "            final_hurr_data['dissipated'] = dissipated\n",
+    "            final_hurr_data['mph'] = mph\n",
+    "            final_hurr_data['damage'] = damage\n",
+    "            final_hurr_data['deaths'] = deaths\n",
+    "            full_data_hurr.append(final_hurr_data)\n",
+    "        \n",
+    "len(full_data_hurr)"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 14,
+   "id": "7374ea3b",
+   "metadata": {},
+   "outputs": [
+    {
+     "name": "stdout",
+     "output_type": "stream",
+     "text": [
+      "Open 'hurricanes.csv' to find the extracted data\n"
+     ]
+    },
+    {
+     "data": {
+      "text/html": [
+       "<div>\n",
+       "<style scoped>\n",
+       "    .dataframe tbody tr th:only-of-type {\n",
+       "        vertical-align: middle;\n",
+       "    }\n",
+       "\n",
+       "    .dataframe tbody tr th {\n",
+       "        vertical-align: top;\n",
+       "    }\n",
+       "\n",
+       "    .dataframe thead th {\n",
+       "        text-align: right;\n",
+       "    }\n",
+       "</style>\n",
+       "<table border=\"1\" class=\"dataframe\">\n",
+       "  <thead>\n",
+       "    <tr style=\"text-align: right;\">\n",
+       "      <th></th>\n",
+       "      <th>name</th>\n",
+       "      <th>formed</th>\n",
+       "      <th>dissipated</th>\n",
+       "      <th>mph</th>\n",
+       "      <th>damage</th>\n",
+       "      <th>deaths</th>\n",
+       "    </tr>\n",
+       "  </thead>\n",
+       "  <tbody>\n",
+       "    <tr>\n",
+       "      <th>0</th>\n",
+       "      <td>1804 New England hurricane</td>\n",
+       "      <td>10/04/1804</td>\n",
+       "      <td>10/11/1804</td>\n",
+       "      <td>110</td>\n",
+       "      <td>100K</td>\n",
+       "      <td>16</td>\n",
+       "    </tr>\n",
+       "    <tr>\n",
+       "      <th>1</th>\n",
+       "      <td>1806 Great Coastal hurricane</td>\n",
+       "      <td>08/17/1806</td>\n",
+       "      <td>08/25/1806</td>\n",
+       "      <td>110</td>\n",
+       "      <td>171K</td>\n",
+       "      <td>24</td>\n",
+       "    </tr>\n",
+       "    <tr>\n",
+       "      <th>2</th>\n",
+       "      <td>1812 Louisiana hurricane</td>\n",
+       "      <td>08/15/1812</td>\n",
+       "      <td>08/20/1812</td>\n",
+       "      <td>115</td>\n",
+       "      <td>6M</td>\n",
+       "      <td>100</td>\n",
+       "    </tr>\n",
+       "    <tr>\n",
+       "      <th>3</th>\n",
+       "      <td>1821 Norfolk and Long Island hurricane</td>\n",
+       "      <td>09/01/1821</td>\n",
+       "      <td>09/04/1821</td>\n",
+       "      <td>130</td>\n",
+       "      <td>200K</td>\n",
+       "      <td>22</td>\n",
+       "    </tr>\n",
+       "    <tr>\n",
+       "      <th>4</th>\n",
+       "      <td>1848 Tampa Bay hurricane</td>\n",
+       "      <td>09/23/1848</td>\n",
+       "      <td>09/28/1848</td>\n",
+       "      <td>130</td>\n",
+       "      <td>20K</td>\n",
+       "      <td>0</td>\n",
+       "    </tr>\n",
+       "    <tr>\n",
+       "      <th>...</th>\n",
+       "      <td>...</td>\n",
+       "      <td>...</td>\n",
+       "      <td>...</td>\n",
+       "      <td>...</td>\n",
+       "      <td>...</td>\n",
+       "      <td>...</td>\n",
+       "    </tr>\n",
+       "    <tr>\n",
+       "      <th>549</th>\n",
+       "      <td>Fiona</td>\n",
+       "      <td>09/14/2022</td>\n",
+       "      <td>09/27/2022</td>\n",
+       "      <td>140</td>\n",
+       "      <td>3.09B</td>\n",
+       "      <td>29</td>\n",
+       "    </tr>\n",
+       "    <tr>\n",
+       "      <th>550</th>\n",
+       "      <td>Ian</td>\n",
+       "      <td>09/23/2022</td>\n",
+       "      <td>10/01/2022</td>\n",
+       "      <td>160</td>\n",
+       "      <td>113B</td>\n",
+       "      <td>161</td>\n",
+       "    </tr>\n",
+       "    <tr>\n",
+       "      <th>551</th>\n",
+       "      <td>Hermine</td>\n",
+       "      <td>09/23/2022</td>\n",
+       "      <td>09/26/2022</td>\n",
+       "      <td>40</td>\n",
+       "      <td>9.8M</td>\n",
+       "      <td>0</td>\n",
+       "    </tr>\n",
+       "    <tr>\n",
+       "      <th>552</th>\n",
+       "      <td>Julia</td>\n",
+       "      <td>10/07/2022</td>\n",
+       "      <td>10/10/2022</td>\n",
+       "      <td>85</td>\n",
+       "      <td>406M</td>\n",
+       "      <td>35</td>\n",
+       "    </tr>\n",
+       "    <tr>\n",
+       "      <th>553</th>\n",
+       "      <td>Nicole</td>\n",
+       "      <td>11/07/2022</td>\n",
+       "      <td>11/11/2022</td>\n",
+       "      <td>75</td>\n",
+       "      <td>1B</td>\n",
+       "      <td>11</td>\n",
+       "    </tr>\n",
+       "  </tbody>\n",
+       "</table>\n",
+       "<p>554 rows × 6 columns</p>\n",
+       "</div>"
+      ],
+      "text/plain": [
+       "                                       name      formed  dissipated  mph  \\\n",
+       "0                1804 New England hurricane  10/04/1804  10/11/1804  110   \n",
+       "1              1806 Great Coastal hurricane  08/17/1806  08/25/1806  110   \n",
+       "2                  1812 Louisiana hurricane  08/15/1812  08/20/1812  115   \n",
+       "3    1821 Norfolk and Long Island hurricane  09/01/1821  09/04/1821  130   \n",
+       "4                  1848 Tampa Bay hurricane  09/23/1848  09/28/1848  130   \n",
+       "..                                      ...         ...         ...  ...   \n",
+       "549                                   Fiona  09/14/2022  09/27/2022  140   \n",
+       "550                                     Ian  09/23/2022  10/01/2022  160   \n",
+       "551                                 Hermine  09/23/2022  09/26/2022   40   \n",
+       "552                                   Julia  10/07/2022  10/10/2022   85   \n",
+       "553                                  Nicole  11/07/2022  11/11/2022   75   \n",
+       "\n",
+       "    damage deaths  \n",
+       "0     100K     16  \n",
+       "1     171K     24  \n",
+       "2       6M    100  \n",
+       "3     200K     22  \n",
+       "4      20K      0  \n",
+       "..     ...    ...  \n",
+       "549  3.09B     29  \n",
+       "550   113B    161  \n",
+       "551   9.8M      0  \n",
+       "552   406M     35  \n",
+       "553     1B     11  \n",
+       "\n",
+       "[554 rows x 6 columns]"
+      ]
+     },
+     "execution_count": 14,
+     "metadata": {},
+     "output_type": "execute_result"
+    }
+   ],
+   "source": [
+    "df = pd.DataFrame(full_data_hurr)\n",
+    "df.to_csv(\"hurricanes.csv\", index=False)\n",
+    "print(\"Open 'hurricanes.csv' to find the extracted data\")\n",
+    "df"
+   ]
+  }
+ ],
+ "metadata": {
+  "kernelspec": {
+   "display_name": "Python 3 (ipykernel)",
+   "language": "python",
+   "name": "python3"
+  },
+  "language_info": {
+   "codemirror_mode": {
+    "name": "ipython",
+    "version": 3
+   },
+   "file_extension": ".py",
+   "mimetype": "text/x-python",
+   "name": "python",
+   "nbconvert_exporter": "python",
+   "pygments_lexer": "ipython3",
+   "version": "3.9.13"
+  }
+ },
+ "nbformat": 4,
+ "nbformat_minor": 5
+}
diff --git a/p5/hurricanes.csv b/p5/hurricanes.csv
new file mode 100644
index 0000000000000000000000000000000000000000..fb683b4ce7f15962bd6e5b326b4c4004d2f6c86a
--- /dev/null
+++ b/p5/hurricanes.csv
@@ -0,0 +1,555 @@
+name,formed,dissipated,mph,damage,deaths
+1804 New England hurricane,10/04/1804,10/11/1804,110,100K,16
+1806 Great Coastal hurricane,08/17/1806,08/25/1806,110,171K,24
+1812 Louisiana hurricane,08/15/1812,08/20/1812,115,6M,100
+1821 Norfolk and Long Island hurricane,09/01/1821,09/04/1821,130,200K,22
+1848 Tampa Bay hurricane,09/23/1848,09/28/1848,130,20K,0
+1867 San Narciso hurricane,10/27/1867,10/31/1867,125,1M,811
+1875 Indianola hurricane,09/08/1875,09/18/1875,115,4M,800
+Gale of 1878,10/18/1878,10/23/1878,105,2M,71
+1886 Indianola hurricane,08/12/1886,08/21/1886,150,200K,74
+1887 Halloween tropical storm,10/29/1887,11/06/1887,70,7K,2
+1891 Martinique hurricane,08/18/1891,08/25/1891,125,10M,700
+1893 Sea Islands hurricane,08/15/1893,09/02/1893,120,1M,2000
+1893 Cheniere Caminada hurricane,09/27/1893,10/05/1893,130,5M,2000
+1894 Greater Antilles hurricane,09/18/1894,10/01/1894,120,5.09M,227
+1896 Cedar Keys hurricane,09/22/1896,09/30/1896,125,338M,202
+1896 East Coast hurricane,10/07/1896,10/13/1896,100,500K,4
+1898 Windward Islands hurricane,09/05/1898,09/19/1898,110,2.5M,392
+1898 Georgia hurricane,09/25/1898,10/06/1898,130,1.5M,179
+1899 Carrabelle hurricane,07/28/1899,08/02/1899,100,1M,9
+1899 San Ciriaco hurricane,08/03/1899,09/12/1899,150,20M,3855
+1900 Galveston hurricane,08/27/1900,09/15/1900,145,1.25B,8000
+1901 Louisiana hurricane,08/02/1901,08/18/1901,90,1M,15
+1903 Jamaica hurricane,08/06/1903,08/16/1903,120,10M,188
+1903 Florida hurricane,09/09/1903,09/16/1903,90,500K,14
+1903 New Jersey hurricane,09/12/1903,09/17/1903,100,8M,57
+1906 Mississippi hurricane,09/19/1906,09/29/1906,120,19221,134
+1906 Florida Keys hurricane,10/08/1906,10/23/1906,120,4.14M,240
+1909 Velasco hurricane,07/13/1909,07/22/1909,115,2M,41
+1909 Monterrey hurricane,08/20/1909,08/28/1909,120,50M,4000
+1909 Grand Isle hurricane,09/13/1909,09/22/1909,120,11M,400
+1909 Florida Keys hurricane,10/06/1909,10/13/1909,120,3M,34
+1909 Greater Antilles hurricane,11/08/1909,11/14/1909,105,10M,198
+1910 Cuba hurricane,10/09/1910,10/23/1910,150,1.25M,116
+1912 Jamaica hurricane,11/11/1912,11/22/1912,115,1.5M,105
+1915 Galveston hurricane,08/05/1915,08/23/1915,145,30M,405
+1915 New Orleans hurricane,09/21/1915,10/01/1915,145,13M,279
+1916 Gulf Coast hurricane,06/28/1916,07/10/1916,120,12.5M,34
+1916 Charleston hurricane,07/11/1916,07/15/1916,115,22M,84
+1916 Texas hurricane,08/12/1916,08/20/1916,130,11.8M,37
+1916 Virgin Islands hurricane,10/06/1916,10/15/1916,120,2M,41
+1916 Pensacola hurricane,10/09/1916,10/19/1916,110,100K,29
+1917 Nueva Gerona hurricane,09/20/1917,09/30/1917,150,2.17M,44
+1919 Florida Keys hurricane,09/02/1919,09/16/1919,150,22M,772
+1920 Louisiana hurricane,09/16/1920,09/23/1920,100,1.45M,1
+1921 Tampa Bay hurricane,10/20/1921,10/30/1921,140,10M,8
+1925 Florida tropical storm,11/27/1925,12/01/1925,65,3M,73
+1926 Nassau hurricane,07/22/1926,08/02/1926,140,7.85M,287
+1926 Louisiana hurricane,08/20/1926,08/27/1926,115,6M,25
+1926 Miami hurricane,09/11/1926,09/22/1926,150,100M,539
+1927 Nova Scotia hurricane,08/18/1927,08/29/1927,125,1.6M,192
+1928 Fort Pierce hurricane,08/03/1928,08/14/1928,105,235K,2
+1928 Haiti hurricane,08/07/1928,08/17/1928,90,2M,210
+1928 Okeechobee hurricane,09/06/1928,09/21/1928,160,1.7B,4112
+1929 Bahamas hurricane,09/22/1929,10/04/1929,155,9.31M,155
+1930 San Zenón hurricane,08/29/1930,09/17/1930,155,50M,8000
+1931 British Honduras hurricane,09/06/1931,09/13/1931,130,7.5M,2500
+1932 Freeport hurricane,08/12/1932,08/15/1932,150,7.5M,40
+1932 Florida–Alabama hurricane,08/26/1932,09/04/1932,85,229K,1
+1932 San Ciprián hurricane,09/25/1932,10/02/1932,145,35.8M,272
+1932 Cuba hurricane,10/30/1932,11/14/1932,175,40M,3103
+1933 Trinidad hurricane,06/24/1933,07/08/1933,110,7.2M,35
+1933 Florida–Mexico hurricane,07/24/1933,08/05/1933,90,67.8M,39
+1933 Chesapeake–Potomac hurricane,08/13/1933,08/28/1933,140,41.2M,47
+1933 Cuba–Brownsville hurricane,08/22/1933,09/05/1933,160,27.9M,179
+1933 Treasure Coast hurricane,08/31/1933,09/07/1933,140,3M,3
+1933 Outer Banks hurricane,09/08/1933,09/22/1933,140,4.75M,24
+1933 Tampico hurricane,09/16/1933,09/25/1933,160,5M,184
+1933 Cuba–Bahamas hurricane,10/01/1933,10/09/1933,125,1.1M,10
+1934 Central America hurricane,06/04/1934,06/21/1934,100,9.46M,506
+1935 Labor Day hurricane,08/29/1935,09/10/1935,185,100M,423
+1935 Cuba hurricane,09/23/1935,10/02/1935,140,14.5M,52
+1935 Jérémie hurricane,10/18/1935,10/27/1935,85,16M,2150
+1935 Yankee hurricane,10/30/1935,11/08/1935,105,5.5M,19
+1936 Mid,09/08/1936,09/25/1936,120,4.05M,2
+1938 New England hurricane,09/09/1938,09/23/1938,160,306M,682
+1940 Louisiana hurricane,08/03/1940,08/10/1940,100,10.8M,7
+1940 South Carolina hurricane,08/05/1940,08/15/1940,100,13M,50
+1940 New England hurricane,08/26/1940,09/03/1940,110,4.05M,7
+1941 Texas hurricane,09/16/1941,09/27/1941,125,7.5M,7
+1941 Florida hurricane,10/03/1941,10/13/1941,120,675K,10
+1942 Matagorda hurricane,08/21/1942,08/31/1942,115,26.5M,8
+1942 Belize hurricane,11/05/1942,11/11/1942,110,4M,9
+1943 Surprise Hurricane,07/25/1943,07/29/1943,105,17M,19
+1944 Great Atlantic hurricane,09/09/1944,09/16/1944,160,100M,400
+1944 Cuba–Florida hurricane,10/12/1944,10/24/1944,145,100M,318
+1945 Outer Banks hurricane,06/20/1945,07/04/1945,100,75K,1
+1945 Texas hurricane,08/24/1945,08/29/1945,115,20.1M,3
+1945 Homestead hurricane,09/12/1945,09/20/1945,130,60M,26
+1946 Florida hurricane,10/05/1946,10/14/1946,100,5.2M,5
+1947 Fort Lauderdale hurricane,09/04/1947,09/20/1947,145,110M,51
+1947 Florida–Georgia hurricane,10/09/1947,10/16/1947,105,42.7M,1
+1948 Bermuda–Newfoundland hurricane,09/04/1948,09/16/1948,130,400K,8
+September 1948 Florida hurricane,09/18/1948,09/26/1948,130,14M,13
+1948 Miami hurricane,10/03/1948,10/16/1948,125,12.5M,11
+1949 Florida hurricane,08/23/1949,08/31/1949,130,52M,2
+1949 Texas hurricane,09/27/1949,10/07/1949,110,6.7M,2
+Able,08/12/1950,08/24/1950,125,1.04M,11
+Baker,08/18/1950,09/01/1950,105,2.55M,38
+Dog,08/30/1950,09/18/1950,145,3M,31
+Easy,09/01/1950,09/09/1950,120,3.3M,2
+King,10/13/1950,10/20/1950,130,32M,11
+Able,05/15/1951,05/24/1951,90,0,0
+Charlie,08/12/1951,08/23/1951,130,75M,259
+How,09/28/1951,10/08/1951,100,2M,17
+1952 Groundhog Day tropical storm,02/03/1952,02/05/1952,70,0,0
+Able,08/18/1952,09/02/1952,100,2.75M,3
+Fox,10/20/1952,10/28/1952,145,10M,601
+Barbara,08/11/1953,08/16/1953,90,1.3M,9
+Carol,08/28/1953,09/08/1953,160,2M,5
+Florence,09/23/1953,09/26/1953,115,200K,0
+Alice,06/24/1954,06/26/1954,110,2M,153
+Carol,08/25/1954,09/01/1954,115,462M,72
+Edna,09/02/1954,09/15/1954,125,42.8M,29
+Hazel,10/05/1954,10/18/1954,130,382M,1191
+Alice,12/30/1954,01/06/1955,90,623K,0
+Connie,08/03/1955,08/15/1955,140,86M,77
+Diane,08/07/1955,08/23/1955,105,832M,184
+Ione,09/10/1955,09/21/1955,140,88M,7
+Hilda,09/10/1955,09/20/1955,120,120M,304
+Janet,09/21/1955,09/30/1955,175,65.8M,1023
+Betsy,08/09/1956,08/18/1956,120,50M,37
+Flossy,09/20/1956,10/03/1956,90,24.9M,15
+Greta,10/30/1956,11/06/1956,100,3.6M,1
+Audrey,06/25/1957,06/29/1957,125,150M,431
+Ella,08/30/1958,09/06/1958,110,200K,36
+Helene,09/21/1958,10/04/1958,150,11.4M,1
+Arlene,05/28/1959,05/31/1959,65,500K,1
+Cindy,07/05/1959,07/11/1959,75,75K,6
+Debra,07/23/1959,07/28/1959,85,7M,0
+Gracie,09/20/1959,10/02/1959,140,14M,22
+1960 Texas tropical storm,06/22/1960,06/29/1960,60,3.6M,18
+Abby,07/10/1960,07/16/1960,80,640K,6
+Brenda,07/28/1960,08/01/1960,70,5M,1
+Donna,08/29/1960,09/14/1960,145,980M,439
+Ethel,09/12/1960,09/17/1960,115,1.5M,1
+Anna,07/20/1961,07/24/1961,105,300K,1
+Carla,09/03/1961,09/17/1961,145,326M,43
+Debbie,09/06/1961,09/19/1961,90,50M,68
+Esther,09/10/1961,09/27/1961,160,6M,7
+Hattie,10/27/1961,11/01/1961,165,60.3M,319
+Alma,08/26/1962,08/30/1962,85,1M,1
+Daisy,09/29/1962,10/08/1962,105,1.1M,32
+Arlene,07/31/1963,08/11/1963,115,300K,0
+Cindy,09/16/1963,09/20/1963,65,12.5M,3
+Edith,09/23/1963,09/29/1963,100,46.6M,10
+Ginny,10/16/1963,10/29/1963,110,500K,3
+Abby,08/05/1964,08/08/1964,70,750K,0
+Cleo,08/21/1964,09/05/1964,150,187M,156
+Dora,08/28/1964,09/14/1964,130,280M,5
+Gladys,09/13/1964,09/24/1964,130,100K,1
+Hilda,09/28/1964,10/05/1964,140,126M,38
+Isbell,10/08/1964,10/19/1964,115,30M,7
+Betsy,08/27/1965,09/13/1965,140,1.42B,81
+Debbie,09/24/1965,09/30/1965,60,25M,0
+Alma,06/04/1966,06/14/1966,125,210M,93
+Inez,09/21/1966,10/11/1966,165,227M,1269
+Beulah,09/05/1967,09/22/1967,160,235M,59
+Doria,09/08/1967,09/21/1967,100,150K,3
+Abby,06/01/1968,06/13/1968,75,450K,6
+Gladys,10/13/1968,10/21/1968,100,18.7M,8
+Camille,08/14/1969,08/22/1969,175,1.42B,259
+Francelia,08/29/1969,09/04/1969,100,35.6M,271
+Martha,11/21/1969,11/25/1969,90,30M,5
+Becky,07/19/1970,07/23/1970,65,500K,1
+Celia,07/31/1970,08/05/1970,140,930M,28
+Dorothy,08/17/1970,08/23/1970,70,34M,51
+Felice,09/12/1970,09/17/1970,70,0,0
+1970 Caribbean–Azores hurricane,09/30/1970,10/22/1970,85,65.5M,22
+1970 Canada hurricane,10/12/1970,10/20/1970,105,1K,0
+Beth,08/10/1971,08/16/1971,85,5.1M,1
+Doria,08/20/1971,08/29/1971,65,148M,7
+Fern,09/03/1971,09/13/1971,90,30.2M,2
+Edith,09/05/1971,09/18/1971,160,25.4M,37
+Ginger,09/10/1971,10/07/1971,110,10M,1
+Irene–Olivia,09/11/1971,10/01/1971,115,1M,3
+Laura,11/12/1971,11/22/1971,70,0,1
+1972 Subtropical Storm Alpha,05/23/1972,05/29/1972,70,100K,2
+Agnes,06/14/1972,07/06/1972,85,2.1B,128
+Carrie,08/29/1972,09/05/1972,70,12.45M,4
+Delia,09/01/1973,09/07/1973,70,6M,2
+Fran,10/08/1973,10/12/1973,80,0,0
+1974 Subtropical Storm One,06/22/1974,06/27/1974,65,10M,3
+Alma,08/12/1974,08/15/1974,65,5M,51
+Carmen,08/29/1974,09/10/1974,150,162M,8
+1974 Subtropical Storm Four,10/04/1974,10/08/1974,50,600K,0
+Amy,06/27/1975,07/04/1975,70,0,1
+1975 Tropical Depression Six,07/28/1975,08/01/1975,35,8.8M,3
+Eloise,09/13/1975,09/24/1975,125,560M,80
+Belle,08/06/1976,08/15/1976,120,100M,3
+Anita,08/29/1977,09/04/1977,175,946M,11
+Babe,09/03/1977,09/09/1977,75,13M,0
+Amelia,07/30/1978,08/01/1978,50,110M,33
+Cora,08/07/1978,08/12/1978,90,0,1
+Debra,08/26/1978,08/29/1978,60,0,2
+Ella,08/30/1978,09/05/1978,140,0,0
+Greta–Olivia,09/13/1978,09/23/1978,130,26M,5
+1979 Tropical Depression One,06/11/1979,06/16/1979,35,27M,41
+Bob,07/09/1979,07/16/1979,75,20M,1
+Claudette,07/16/1979,07/29/1979,50,400M,2
+David,08/25/1979,09/08/1979,175,1.54B,2078
+Frederic,08/29/1979,09/15/1979,130,1.77B,12
+Elena,08/29/1979,09/02/1979,40,10M,2
+Henri,09/14/1979,09/24/1979,85,0,0
+Allen,07/31/1980,08/11/1980,190,1.57B,269
+Danielle,09/04/1980,09/07/1980,60,25M,3
+Jeanne,11/07/1980,11/16/1980,100,0,0
+Karl,11/25/1980,11/29/1980,85,0,0
+Arlene,05/06/1981,05/09/1981,60,0,0
+Dennis,08/07/1981,08/26/1981,80,28.5M,3
+1981 Tropical Depression Eight,08/26/1981,08/19/1981,35,56.2M,5
+Katrina,11/03/1982,11/08/1981,85,0,2
+Alberto,06/01/1982,06/06/1982,85,85M,23
+1982 Florida subtropical storm,06/18/1982,06/22/1982,70,10M,3
+Beryl,08/28/1982,09/06/1982,70,3M,3
+Chris,09/09/1982,09/13/1982,65,2M,0
+Alicia,08/15/1983,08/21/1983,115,3B,21
+Barry,08/23/1983,08/29/1983,80,0,0
+Diana,09/08/1984,09/16/1984,130,65.5M,3
+Fran,09/15/1984,09/20/1984,65,2.8M,32
+Isidore,09/25/1984,10/01/1984,60,1M,1
+Klaus,11/05/1984,11/16/1984,90,152M,2
+Lili,12/12/1984,12/24/1984,80,0,0
+Bob,07/21/1985,07/26/1985,75,20M,5
+Danny,08/12/1985,08/20/1985,90,100M,5
+Elena,08/28/1985,09/04/1985,125,1.3B,9
+Gloria,09/16/1985,10/04/1985,145,900M,14
+Juan,10/26/1985,11/03/1985,85,1.5B,12
+Kate,11/15/1985,11/23/1985,120,700M,15
+Bonnie,06/23/1986,06/28/1986,85,42M,5
+Charley,08/15/1986,08/30/1986,80,15M,15
+Danielle,09/07/1986,09/10/1986,60,10.5M,0
+Arlene,08/10/1987,08/23/1987,75,8K,0
+Emily,09/20/1987,09/26/1987,125,80.3M,3
+Floyd,10/09/1987,10/13/1987,75,500K,1
+1987 Tropical Depression Fourteen,10/31/1987,11/04/1987,35,1.8M,6
+Beryl,08/08/1988,08/10/1988,50,3M,1
+Chris,08/21/1988,08/30/1988,50,2.2M,6
+Florence,09/07/1988,09/11/1988,80,2.9M,1
+Gilbert,09/08/1988,09/19/1988,185,2.98B,318
+Joan–Miriam,10/10/1988,11/02/1988,70,2B,334
+Keith,11/17/1988,11/26/1988,70,7.3M,0
+Allison,06/24/1989,07/07/1989,50,560M,11
+Chantal,07/30/1989,08/03/1989,80,100M,13
+Dean,07/31/1989,08/08/1989,105,8.9M,0
+Gabrielle,08/30/1989,09/13/1989,145,0,9
+Hugo,09/10/1989,09/25/1989,160,11B,67
+Jerry,10/12/1989,10/16/1989,85,70M,3
+Bertha,07/24/1990,08/02/1990,80,3.91M,9
+Diana,08/04/1990,08/09/1990,100,90.7M,139
+Gustav,08/24/1990,09/03/1990,120,0,0
+Klaus,10/03/1990,10/09/1990,80,1M,11
+Marco,10/09/1990,10/13/1990,65,57M,12
+Bob,08/16/1991,08/29/1991,115,1.5B,17
+Grace,10/25/1991,10/30/1991,105,0,0
+1991 Perfect Storm,10/28/1991,11/02/1991,75,200M,13
+1992 Tropical Depression One,06/25/1992,06/26/1992,35,2.6M,4
+Andrew,08/16/1992,08/29/1992,175,27.3B,65
+Bonnie,09/17/1992,09/30/1992,110,0,1
+Arlene,06/18/1993,06/21/1993,40,60.8M,26
+Bret,08/04/1993,08/11/1993,60,35.7M,213
+Cindy,08/14/1993,08/17/1993,45,19M,4
+Emily,08/22/1993,09/06/1993,115,35M,3
+Gert,09/14/1993,09/26/1993,100,170M,116
+Alberto,06/30/1994,07/07/1994,65,1.03B,32
+Beryl,08/14/1994,08/19/1994,60,74.2M,5
+Debby,09/09/1994,09/11/1994,70,115M,9
+Florence,11/02/1994,11/08/1994,110,0,0
+Gordon,11/08/1994,11/21/1994,85,594M,1152
+Allison,06/03/1995,06/11/1995,75,1.7M,1
+Dean,07/28/1995,08/02/1995,45,500K,1
+Erin,07/31/1995,08/06/1995,100,700M,16
+Felix,08/08/1995,08/25/1995,140,3.63M,9
+Gabrielle,08/09/1995,08/12/1995,70,0,6
+Jerry,08/22/1995,08/28/1995,40,40M,6
+Luis,08/28/1995,09/12/1995,150,3.3B,19
+Marilyn,09/12/1995,09/30/1995,115,2.5B,13
+Opal,09/27/1995,10/06/1995,150,4.7B,63
+Roxanne,10/07/1995,10/21/1995,115,1.5B,29
+Tanya,10/27/1995,11/03/1995,85,0,1
+Arthur,06/17/1996,06/24/1996,45,1M,0
+Bertha,07/05/1996,07/18/1996,115,335M,12
+Cesar–Douglas,07/24/1996,08/06/1996,130,203M,113
+Edouard,08/19/1996,09/07/1996,145,20M,2
+Fran,08/23/1996,09/10/1996,120,5B,22
+Hortense,09/03/1996,09/16/1996,140,158M,39
+Josephine,10/04/1996,10/13/1996,70,130M,3
+Lili,10/14/1996,10/30/1996,115,662M,22
+Marco,11/16/1996,11/26/1996,75,8.2M,15
+Danny,07/16/1997,07/27/1997,80,100M,9
+Erika,09/03/1997,09/20/1997,125,10M,2
+Bonnie,08/19/1998,08/30/1998,115,1B,5
+Charley,08/21/1998,08/24/1998,70,50M,20
+Danielle,08/24/1998,09/08/1998,105,50K,0
+Earl,08/31/1998,09/08/1998,100,79M,3
+Frances,09/08/1998,09/13/1998,65,500M,1
+Georges,09/15/1998,10/01/1998,155,9.37B,604
+Hermine,09/17/1998,09/20/1998,45,85K,2
+Bret,08/18/1999,08/25/1999,145,15M,1
+Dennis,08/24/1999,09/09/1999,105,157M,6
+Floyd,09/07/1999,09/19/1999,155,6.5B,85
+Gert,09/11/1999,09/23/1999,150,1.9M,2
+Harvey,09/19/1999,09/22/1999,60,22.6M,0
+Irene,10/13/1999,10/24/1999,110,800M,3
+Jose,10/17/1999,10/25/1999,100,5M,3
+Katrina,10/28/1999,11/01/1999,40,9K,0
+Lenny,11/13/1999,11/23/1999,155,786M,17
+Alberto,08/03/2000,08/25/2000,125,0,0
+Beryl,08/13/2000,08/15/2000,50,27K,1
+Debby,08/19/2000,08/24/2000,85,735K,1
+Florence,09/10/2000,09/19/2000,80,0,3
+Gordon,09/14/2000,09/21/2000,80,10.8M,24
+Helene,09/15/2000,09/25/2000,70,16M,1
+Isaac,09/21/2000,10/04/2000,140,0,1
+Joyce,09/25/2000,10/02/2000,90,0,0
+Keith,09/28/2000,10/06/2000,140,319M,62
+Leslie,10/04/2000,10/12/2000,45,950M,3
+Allison,06/05/2001,06/20/2001,60,9B,41
+Barry,08/02/2001,08/08/2001,70,30M,2
+Chantal,08/14/2001,08/22/2001,70,4M,2
+Dean,08/22/2001,08/28/2001,70,7.7M,0
+Erin,09/01/2001,09/17/2001,120,0,0
+Gabrielle,09/11/2001,09/19/2001,80,230M,2
+Iris,10/04/2001,10/09/2001,145,250M,36
+Karen,10/12/2001,10/15/2001,80,1.4M,0
+Michelle,10/29/2001,11/06/2001,140,2.43B,48
+Arthur,07/14/2002,07/19/2002,60,0,1
+Bertha,08/04/2002,08/09/2002,40,200K,1
+Cristobal,08/05/2002,08/13/2002,50,0,3
+Fay,09/05/2002,09/11/2002,60,4.5M,0
+Gustav,09/08/2002,09/15/2002,100,340K,4
+Hanna,09/12/2002,09/15/2002,60,20M,3
+Isidore,09/14/2002,09/27/2002,125,1.28B,19
+Kyle,09/20/2002,10/14/2002,85,5M,1
+Lili,09/21/2002,10/04/2002,145,1.16B,15
+Ana,04/20/2003,04/27/2003,60,0,2
+Bill,06/29/2003,07/03/2003,60,50.5M,4
+Claudette,07/08/2003,07/17/2003,90,181M,3
+Erika,08/14/2003,08/20/2003,75,100K,2
+Fabian,08/27/2003,09/10/2003,145,300M,8
+Grace,08/30/2003,09/02/2003,40,113K,0
+Henri,09/03/2003,09/08/2003,60,19.6M,0
+Isabel,09/06/2003,09/20/2003,165,3.6B,51
+Juan,09/24/2003,09/29/2003,105,200M,8
+Kate,09/25/2003,10/10/2003,125,0,0
+Larry,10/01/2003,10/06/2003,65,53.6M,5
+Nicholas,10/13/2003,11/05/203,70,0,0
+Odette,12/04/2003,12/09/2003,65,8M,8
+Alex,07/31/2004,08/06/2004,120,7.5M,1
+Bonnie,08/03/2004,08/14/2004,65,1.27M,3
+Charley,08/09/2004,08/15/2004,150,16.9B,35
+Earl,08/13/2004,08/15/2004,50,0,1
+Frances,08/24/2004,09/10/2004,145,10.1B,50
+Gaston,08/27/2004,09/03/2004,75,130M,8
+Ivan,09/02/2004,09/25/2004,165,26.1B,124
+Jeanne,09/13/2004,09/29/2004,120,7.94B,3037
+Karl,09/16/2004,09/28/2004,145,0,0
+Matthew,10/08/2004,10/11/2004,45,305K,0
+Arlene,06/08/2005,06/14/2005,70,11.8M,2
+Bret,06/28/2005,06/30/2005,40,9.3M,3
+Cindy,07/03/2005,07/12/2005,75,71.5M,0
+Dennis,07/04/2005,07/18/2005,150,3.98B,88
+Emily,07/11/2005,07/21/2005,160,1.01B,22
+Gert,07/23/2005,07/25/2005,45,6M,1
+Irene,08/04/2005,08/18/2005,105,0,1
+Jose,08/22/2005,08/23/2005,60,45M,16
+Katrina,08/23/2005,08/31/2005,175,125B,1392
+Maria,09/01/2005,09/14/2005,115,3.1M,1
+Nate,09/05/2005,09/13/2005,90,0,2
+Ophelia,09/06/2005,09/23/2005,85,70M,3
+Rita,09/18/2005,09/26/2005,180,18.5B,120
+Stan,10/01/2005,10/05/2005,80,3.96B,1668
+Tammy,10/05/2005,10/06/2005,50,30M,10
+Vince,10/08/2005,10/11/2005,75,0,0
+Wilma,10/15/2005,10/27/2005,185,22.4B,52
+Beta,10/26/2005,10/31/2005,115,15.5M,9
+Gamma,11/14/2005,11/22/2005,50,18M,39
+Delta,11/22/2005,11/30/2005,70,364M,7
+Epsilon,11/29/2005,12/10/2005,85,0,0
+Zeta,12/30/2005,01/07/2006,65,0,0
+Alberto,06/10/2006,06/19/2006,70,420K,3
+Beryl,07/18/2006,07/21/2006,60,0,0
+Chris,08/01/2006,08/04/2006,65,0,0
+Debby,08/21/2006,08/26/2006,50,0,0
+Ernesto,08/24/2006,09/01/2006,75,500M,11
+Florence,09/03/2006,09/19/2006,90,200K,0
+Gordon,09/10/2006,09/24/2006,120,3.8M,0
+Helene,09/12/2006,09/24/2006,120,0,0
+Isaac,09/27/2006,10/02/2006,85,0,0
+2007 Subtropical Storm Andrea,05/09/2007,05/14/2007,60,0,6
+Barry,06/01/2007,06/05/2007,60,118K,1
+Chantal,07/31/2007,08/05/2007,50,24.3M,0
+Dean,08/13/2007,08/27/2007,175,1.66B,45
+Erin,08/15/2007,08/20/2007,40,248M,21
+Felix,08/31/2007,09/07/2007,175,720M,130
+Gabrielle,09/08/2007,09/11/2007,60,0,1
+Humberto,09/12/2007,09/14/2007,90,50M,1
+2007 Tropical Depression Ten,09/21/2007,09/22/2007,35,6.2M,0
+Lorenzo,09/25/2007,09/28/2007,80,92M,6
+Noel,10/28/2007,11/07/2007,80,580M,222
+Olga,12/11/2007,12/17/2007,60,45M,40
+Arthur,05/31/2008,06/06/2008,45,78M,5
+Bertha,07/03/2008,07/21/2008,125,0,3
+Cristobal,07/19/2008,07/23/2008,65,10K,0
+Dolly,07/20/2008,07/27/2008,100,1.6B,22
+Edouard,08/03/2008,08/06/2008,65,550K,6
+Fay,08/15/2008,08/29/2008,70,560M,36
+Gustav,08/25/2008,09/07/2008,155,8.31B,153
+Hanna,08/28/2008,09/12/2008,85,160M,537
+Ike,09/01/2008,09/15/2008,145,38B,214
+Kyle,09/25/2008,09/30/2008,85,57.1M,8
+Laura,09/29/2008,10/04/2008,60,0,0
+Marco,10/06/2008,10/07/2008,65,0,0
+Omar,10/13/2008,10/21/2008,130,80M,1
+Paloma,11/05/2008,11/14/2008,145,455M,1
+Ana,08/11/2009,08/16/2009,40,0,0
+Bill,08/15/2009,08/26/2009,130,46.2M,2
+Claudette,08/16/2009,08/18/2009,60,350K,2
+Danny,08/26/2009,08/29/2009,60,0,1
+Erika,09/01/2009,09/04/2009,50,35K,0
+Fred,09/07/2009,09/19/2009,120,0,0
+Grace,10/04/2009,10/07/2009,65,0,0
+Ida,11/04/2009,11/11/2009,105,11.4M,4
+Alex,06/25/2010,07/06/2010,110,1.52B,51
+2010 Tropical Depression Two,07/08/2010,07/10/2010,35,0,0
+Bonnie,07/22/2010,07/25/2010,45,1.36M,1
+Colin,08/02/2010,08/09/2010,60,0,1
+2010 Tropical Depression Five,08/10/2010,08/18/2010,35,1M,2
+Earl,08/25/2010,09/05/2010,145,45M,8
+Hermine,09/03/2010,09/10/2010,70,740M,52
+Igor,09/08/2010,09/23/2010,155,200M,4
+Julia,09/12/2010,09/28/2010,140,0,0
+Karl,09/14/2010,09/18/2010,125,3.9B,22
+Matthew,09/23/2010,09/28/2010,60,171M,126
+Nicole,09/28/2010,09/30/2010,45,245M,20
+Otto,10/06/2010,10/18/2010,85,22.5M,0
+Richard,10/20/2010,10/27/2010,100,80M,1
+Shary,10/27/2010,10/31/2010,75,0,0
+Tomas,10/29/2010,11/11/2010,100,463M,44
+Arlene,06/28/2011,07/01/2011,65,223M,18
+Bret,07/17/2011,07/23/2011,70,0,0
+Emily,08/02/2011,08/11/2011,50,5M,4
+Harvey,08/19/2011,08/22/2011,65,0,5
+Irene,08/21/2011,08/30/2011,120,14.2B,58
+Katia,08/29/2011,09/13/2011,140,157M,3
+Lee,09/02/2011,09/07/2011,60,2.8B,18
+Maria,09/06/2011,09/18/2011,80,1.3M,0
+Nate,09/07/2011,09/12/2011,75,0,4
+Ophelia,09/20/2011,10/07/2011,140,0,0
+Rina,10/23/2011,10/29/2011,115,2.3M,0
+Beryl,05/26/2012,06/02/2012,70,148K,1
+Debby,06/23/2012,06/30/2012,65,250M,5
+Ernesto,08/01/2012,08/10/2012,100,252M,12
+Helene,08/09/2012,08/18/2012,45,17M,2
+Isaac,08/21/2012,09/03/2012,80,3.11B,41
+Leslie,08/30/2012,09/12/2012,80,10.1M,0
+Nadine,09/10/2012,10/04/2012,90,0,0
+Rafael,10/12/2012,10/26/2012,90,2M,1
+Sandy,10/22/2012,11/02/2012,115,68.7B,233
+Andrea,06/05/2013,06/10/2013,65,86K,1
+Barry,06/17/2013,06/20/2013,45,0,5
+Chantal,07/07/2013,07/10/2013,65,10M,1
+Ingrid,09/12/2013,09/17/2013,85,1.5B,32
+Karen,10/03/2013,10/15/2013,65,18K,0
+Arthur,07/01/2014,07/09/2014,100,39.5M,2
+Bertha,08/01/2014,08/16/2014,80,0,4
+Dolly,09/01/2014,09/04/2014,50,22.2M,1
+Fay,10/10/2014,10/13/2014,80,3.8M,0
+Gonzalo,10/12/2014,10/20/2014,145,317M,6
+Ana,05/08/2015,05/12/2015,60,0,1
+Bill,06/16/2015,06/23/2015,60,100M,8
+Danny,08/18/2015,08/24/2015,125,0,0
+Erika,08/24/2015,09/03/2015,50,511M,35
+Fred,08/30/2015,09/06/2015,85,2.5M,9
+Joaquin,09/28/2015,10/15/2015,155,200M,34
+Kate,11/08/2015,11/13/2015,85,0,0
+Alex,01/12/2016,01/17/2016,85,0,1
+Bonnie,05/27/2016,06/09/2016,45,640K,2
+Colin,06/05/2016,06/08/2016,50,1.04M,6
+Hermine,08/28/2016,09/08/2016,80,550M,4
+Julia,09/13/2016,09/21/2016,50,6.13M,0
+Matthew,09/28/2016,10/10/2016,165,16.5B,603
+Nicole,10/04/2016,10/20/2016,140,15M,1
+Otto,11/20/2016,11/26/2016,115,192M,23
+Bret,06/19/2017,06/20/2017,50,3M,1
+Cindy,06/20/2017,06/24/2017,60,25M,2
+Emily,07/30/2017,08/02/2017,60,10M,0
+Franklin,08/07/2017,08/10/2017,85,15M,0
+Gert,08/12/2017,08/18/2017,110,0,2
+Harvey,08/17/2017,09/02/2017,130,125B,107
+Irma,08/30/2017,09/13/2017,180,77.2B,52
+Jose,09/05/2017,09/25/2017,155,2.84M,1
+Katia,09/05/2017,09/09/2017,105,3.26M,3
+Maria,09/16/2017,10/02/2017,175,91.6B,3059
+Nate,10/04/2017,10/11/2017,90,787M,48
+Ophelia,10/09/2017,10/18/2017,115,87.7M,3
+Philippe,10/28/2017,10/29/2017,40,100M,5
+2017 Potential Tropical Cyclone Ten,08/27/2017,09/03/2017,45,1.92M,2
+Alberto,05/25/2018,06/01/2018,65,125M,18
+Beryl,07/04/2018,07/17/2018,80,1M,0
+Chris,07/06/2018,07/17/2018,105,0,1
+Florence,08/31/2018,09/18/2018,150,24.2B,24
+Gordon,09/03/2018,09/08/2018,70,200M,3
+Kirk,09/22/2018,09/28/2018,65,440K,2
+Leslie,09/23/2018,10/16/2018,90,500M,17
+Michael,10/07/2018,10/16/2018,160,25.5B,74
+Barry,07/11/2019,07/19/2019,75,900M,3
+Dorian,08/24/2019,09/10/2019,185,5.1B,84
+Fernand,09/03/2019,09/05/2019,50,11.3M,1
+Humberto,09/13/2019,09/20/2019,125,25M,2
+Imelda,09/17/2019,09/19/2019,45,5B,7
+Karen,09/22/2019,09/27/2019,45,3.53M,0
+Lorenzo,09/23/2019,10/07/2019,160,367M,19
+Melissa,10/11/2019,10/14/2019,65,24K,0
+Nestor,10/18/2019,10/21/2019,60,150M,3
+Pablo,10/25/2019,10/29/2019,80,0,0
+Olga,10/25/2019,10/27/2019,45,400M,2
+Arthur,05/16/2020,05/21/2020,60,112K,0
+Bertha,05/27/2020,05/28/2020,50,130K,1
+Amanda and Cristobal,06/01/2020,06/12/2020,60,865M,46
+Fay,07/09/2020,07/12/2020,60,220M,6
+Hanna,07/23/2020,07/26/2020,90,1.2B,9
+Isaias,07/30/2020,08/05/2020,90,5.03B,17
+Laura,08/20/2020,08/29/2020,150,23.3B,81
+Marco,08/21/2020,08/26/2020,75,35M,0
+Paulette,09/07/2020,09/28/2020,105,50M,2
+Sally,09/11/2020,09/18/2020,110,7.3B,4
+Teddy,09/12/2020,09/24/2020,140,35M,3
+2020 Subtropical Storm Alpha,09/17/2020,09/19/2020,50,24.2M,1
+Beta,09/17/2020,09/25/2020,65,225M,1
+Gamma,10/02/2020,10/06/2020,75,100M,6
+Delta,10/04/2020,10/12/2020,140,3.09B,6
+Epsilon,10/19/2020,10/26/2020,115,0,1
+Zeta,10/24/2020,10/30/2020,115,4.4B,9
+Eta,10/31/2020,11/14/2020,150,8.3B,175
+Iota,11/13/2020,11/18/2020,155,1.4B,84
+Claudette,06/19/2021,06/23/2021,45,375M,4
+Danny,06/27/2021,06/29/2021,45,5K,0
+Elsa,06/30/2021,07/10/2021,85,1.2B,13
+Fred,08/11/2021,08/20/2021,65,1.3B,7
+Grace,08/13/2021,08/21/2021,120,513M,16
+Henri,08/15/2021,08/21/2021,75,700M,2
+Ida,08/26/2021,09/05/2021,150,75.3B,107
+Larry,08/31/2021,09/12/2021,125,80M,5
+Mindy,09/08/2021,09/11/2021,60,75.2M,23
+Nicholas,09/12/2021,09/20/2021,75,1.1B,2
+Alex,06/05/2022,06/07/2022,70,0,4
+Bonnie,07/01/2022,07/11/2022,115,25M,5
+Danielle,09/01/2022,09/15/2022,85,0,0
+Earl,09/02/2022,09/15/2022,110,0,2
+Fiona,09/14/2022,09/27/2022,140,3.09B,29
+Ian,09/23/2022,10/01/2022,160,113B,161
+Hermine,09/23/2022,09/26/2022,40,9.8M,0
+Julia,10/07/2022,10/10/2022,85,406M,35
+Nicole,11/07/2022,11/11/2022,75,1B,11
diff --git a/p5/images/README.md b/p5/images/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..17b4c7caf68d1eff88b8b32a73ce1d756731e398
--- /dev/null
+++ b/p5/images/README.md
@@ -0,0 +1,3 @@
+# Images
+
+Images from p5 are stored here.
diff --git a/p5/images/add_group_member.png b/p5/images/add_group_member.png
new file mode 100644
index 0000000000000000000000000000000000000000..402e5962e3e54ce8349f60ccfe4ce2b60840dd3b
Binary files /dev/null and b/p5/images/add_group_member.png differ
diff --git a/p5/images/gradescope.png b/p5/images/gradescope.png
new file mode 100644
index 0000000000000000000000000000000000000000..7441faae41d8eb98bfceeb78855b67896b1ff911
Binary files /dev/null and b/p5/images/gradescope.png differ
diff --git a/p5/images/summary.png b/p5/images/summary.png
new file mode 100644
index 0000000000000000000000000000000000000000..4a63e32ff1a29903584746aa4873373855558e7b
Binary files /dev/null and b/p5/images/summary.png differ
diff --git a/p5/images/table.PNG b/p5/images/table.PNG
new file mode 100644
index 0000000000000000000000000000000000000000..eb8f8ee7f0d75373fb58ab32f478e1b36655bb6a
Binary files /dev/null and b/p5/images/table.PNG differ
diff --git a/p5/p5.ipynb b/p5/p5.ipynb
new file mode 100644
index 0000000000000000000000000000000000000000..1e0399e9f6371d46fc6276a551c41e170c5eb86f
--- /dev/null
+++ b/p5/p5.ipynb
@@ -0,0 +1,2546 @@
+{
+ "cells": [
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "9b3ef23d",
+   "metadata": {
+    "cell_type": "code",
+    "deletable": false,
+    "editable": false
+   },
+   "outputs": [],
+   "source": [
+    "# import and initialize otter\n",
+    "import otter\n",
+    "grader = otter.Notebook(\"p5.ipynb\")"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "608620fe",
+   "metadata": {
+    "editable": false,
+    "execution": {
+     "iopub.execute_input": "2023-10-04T01:14:41.279817Z",
+     "iopub.status.busy": "2023-10-04T01:14:41.279817Z",
+     "iopub.status.idle": "2023-10-04T01:14:42.308930Z",
+     "shell.execute_reply": "2023-10-04T01:14:42.308930Z"
+    }
+   },
+   "outputs": [],
+   "source": [
+    "import public_tests"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "feb762da",
+   "metadata": {
+    "execution": {
+     "iopub.execute_input": "2023-10-04T01:14:42.312936Z",
+     "iopub.status.busy": "2023-10-04T01:14:42.312936Z",
+     "iopub.status.idle": "2023-10-04T01:14:42.316033Z",
+     "shell.execute_reply": "2023-10-04T01:14:42.316033Z"
+    }
+   },
+   "outputs": [],
+   "source": [
+    "# PLEASE FILL IN THE DETAILS\n",
+    "# enter none if you don't have a project partner\n",
+    "# you will have to add your partner as a group member on Gradescope even after you fill this\n",
+    "\n",
+    "# project: p5\n",
+    "# submitter: NETID1\n",
+    "# partner: NETID2\n",
+    "# hours: ????"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "0c018208",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "# Project 5: Investigating Hurricane Data"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "eb9d6c94",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "## Learning Objectives:\n",
+    "\n",
+    "In this project you will demonstrate how to:\n",
+    "- write fundamental loop structures,\n",
+    "- perform basic string manipulations,\n",
+    "- create your own helper functions as outlined in Lab-P5.\n",
+    "\n",
+    "**Please go through [Lab-P5](https://git.doit.wisc.edu/cdis/cs/courses/cs220/cs220-f23-projects/-/tree/main/lab-p5) before working on this project.** The lab introduces some useful techniques related to this project."
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "47697cfa",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "## Testing your code:\n",
+    "\n",
+    "Along with this notebook, you must have downloaded the file `public_tests.py`. If you are curious about how we test your code, you can explore this file, and specifically the function `get_expected_json`, to understand the expected answers to the questions."
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "e9754830",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "## Project Description:\n",
+    "\n",
+    "Hurricanes often count among the worst natural disasters, both in terms of monetary costs, and more importantly, human life. Data Science can help us better understand these storms. For example, take a quick look at this FiveThirtyEight analysis by Maggie Koerth-Baker: [Why We're Stuck With An Inadequate Hurricane Rating System](https://fivethirtyeight.com/features/why-were-stuck-with-an-inadequate-hurricane-rating-system/)\n",
+    "\n",
+    "For this project, you'll be analyzing data in the `hurricanes.csv` file. We generated this data file by writing a Python program to extract data from several lists of hurricanes over the Atlantic Ocean on Wikipedia (here is an [example](https://en.wikipedia.org/wiki/2022_Atlantic_hurricane_season)). You can take a look at the script `gen_csv.ipynb` yourself. At the end of the semester, you will be able to write it yourself. \n",
+    "\n",
+    "We won't explain how to use the `project` module here (the code in the `project.py` file). Refer to [Lab-P5](https://git.doit.wisc.edu/cdis/cs/courses/cs220/cs220-f23-projects/-/tree/main/lab-p5) to understand how the module works. If necessary, use the `help` function to learn about the various functions inside `project.py`. Feel free to take a look at the `project.py` code, if you are curious about how it works.\n",
+    "\n",
+    "This project consists of writing code to answer 20 questions."
+   ]
+  },
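+  {
+   "cell_type": "markdown",
+   "id": "c0ffee01",
+   "metadata": {},
+   "source": [
+    "For example, one quick way to explore the module is sketched below. This is only illustrative: it assumes `project.py` sits in the same directory as this notebook (as in the Lab-P5 setup), and `project.count` is simply one function that the requirements below refer to; `help(project)` lists everything the module actually provides.\n",
+    "\n",
+    "```python\n",
+    "import project          # the provided module for this project\n",
+    "\n",
+    "help(project)           # list every function the module provides, with docstrings\n",
+    "help(project.count)     # read the docstring of a single function\n",
+    "```"
+   ]
+  },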
+  {
+   "cell_type": "markdown",
+   "id": "d0388404",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "## Dataset:\n",
+    "\n",
+    "The dataset you will be working with in this project is linked [here](https://git.doit.wisc.edu/cdis/cs/courses/cs220/cs220-f23-projects/-/tree/main/p5/hurricanes.csv). Be sure to look at this csv to see what it contains, and specifically what the names of the columns are.\n",
+    "\n",
+    "If needed, you can open the `hurricanes.csv` file, to verify answers to simple questions, but you must still have the correct code in your submission!"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "6a7614b2",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "## Project Requirements:\n",
+    "\n",
+    "You **may not** hardcode indices in your code unless specified in the question. If you hardcode the value of `project.count()`, the Gradescope autograder will **deduct** points. If you are not sure what hardcoding is, here is a simple test you can use to determine whether you have hardcoded:\n",
+    "\n",
+    "*If we were to change the data (e.g. add more hurricanes, remove some hurricanes, or swap some columns or rows), would your code still find the correct answer to the question as it is asked?*\n",
+    "\n",
+    "If your answer to that question is *No*, then you have likely hardcoded something. Please reach out to TAs/PMs during office hours to find out how you can **avoid hardcoding**.\n",
+    "\n",
+    "**Store** your final answer for each question in the **variable specified for each question**. This step is important because Otter grades your work by comparing the value of this variable against the correct answer.\n",
+    "\n",
+    "For some of the questions, we'll ask you to write (then use) a function to compute the answer.  If you compute the answer **without** creating the function we ask you to write, the Gradescope autograder will **deduct** points, even if the way you did it produced the correct answer.\n",
+    "\n",
+    "Required Functions:\n",
+    "- `get_month`\n",
+    "- `get_day`\n",
+    "- `get_year`\n",
+    "- `format_damage`\n",
+    "- `deadliest_in_range`\n",
+    "- `get_year_total`\n",
+    "    \n",
+    "Students are only allowed to use Python commands and concepts that have been taught in the course prior to the release of P5. Therefore, **you should not use concepts/modules such as lists, dictionaries, or the pandas module, to name a few examples**. Otherwise, the Gradescope autograder will **deduct** points, even if the way you did it produced the correct answer.\n",
+    "\n",
+    "For more details on what will cause you to lose points during code review and specific requirements, please take a look at the [Grading rubric](https://git.doit.wisc.edu/cdis/cs/courses/cs220/cs220-f23-projects/-/blob/main/p5/rubric.md)."
+   ]
+  },
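+  {
+   "cell_type": "markdown",
+   "id": "c0ffee02",
+   "metadata": {},
+   "source": [
+    "To make the hardcoding test above concrete, here is a rough sketch of the kind of loop that keeps working even if rows are added, removed, or reordered. Only `project.count()` comes from the requirements above; the accessor name `project.get_name` and the 0-based indexing are assumptions made for illustration, so check `help(project)` for the real function names before relying on this pattern.\n",
+    "\n",
+    "```python\n",
+    "# illustrative sketch only: count how many storms in the dataset are named Bonnie\n",
+    "# assumes rows are indexed 0 .. project.count() - 1 and an accessor like project.get_name(idx)\n",
+    "bonnie_count = 0\n",
+    "for idx in range(project.count()):\n",
+    "    if project.get_name(idx) == \"Bonnie\":\n",
+    "        bonnie_count += 1\n",
+    "bonnie_count\n",
+    "```"
+   ]
+  },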
+  {
+   "cell_type": "markdown",
+   "id": "63006afc",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "## Incremental Coding and Testing:\n",
+    "\n",
+    "You should always strive to do incremental coding. Incremental coding enables you to avoid challenging bugs. Always write a few lines of code and then test those lines of code, before proceeding to write further code. You can call the `print` function to test intermediate step outputs.\n",
+    "\n",
+    "We also recommend you do incremental testing: make sure to run the local tests as soon as you are done with a question. This will ensure that you haven't made a big mistake that might potentially impact the rest of your project solution. Please refrain from making multiple submissions on Gradescope for testing individual questions' answers. Instead use the local tests, to test your solution on your laptop.\n",
+    "\n",
+    "That said, it is **important** that you check the Gradescope test results as soon as you submit your project on Gradescope. Test results on Gradescope are typically available somewhere between 10 to 20 minutes after the submission."
+   ]
+  },
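+  {
+   "cell_type": "markdown",
+   "id": "c0ffee03",
+   "metadata": {},
+   "source": [
+    "For example, right after answering a question you would typically run its local check, along the lines of the sketch below. The label `q1` is only a placeholder for whichever question you just finished.\n",
+    "\n",
+    "```python\n",
+    "# run the local test for a single question right after answering it\n",
+    "# (\"q1\" is a placeholder; replace it with the label of the question you want to test)\n",
+    "grader.check(\"q1\")\n",
+    "```"
+   ]
+  },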
+  {
+   "cell_type": "markdown",
+   "id": "4d776143",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "## Project Questions and Functions:"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "9bc58344",
+   "metadata": {
+    "execution": {
+     "iopub.execute_input": "2023-10-04T01:14:42.320041Z",
+     "iopub.status.busy": "2023-10-04T01:14:42.320041Z",
+     "iopub.status.idle": "2023-10-04T01:14:42.395082Z",
+     "shell.execute_reply": "2023-10-04T01:14:42.394041Z"
+    },
+    "tags": []
+   },
+   "outputs": [],
+   "source": [
+    "# it is considered a good coding practice to place all import statements at the top of the notebook\n",
+    "# please place all your import statements in this cell if you need to import any more modules for this project\n"
+   ]
+  },
+  {
+   "attachments": {
+    "table.PNG": {
+     "image/png": "iVBORw0KGgoAAAANSUhEUgAAAuwAAACcCAYAAAAknf4UAAAAAXNSR0IArs4c6QAAAARnQU1BAACxjwv8YQUAAAAJcEhZcwAAEnQAABJ0Ad5mH3gAADWrSURBVHhe7Z3bjhU9c4bnouZormWO5z7mDuY0/AiBGERAhCAIIAIi/IgVhBAIkYgTznMbTnvX7U3Zrt63u99HKn10t9uuncs1i/UxZ//3f/8nIBAIBAKBQCAQyDYFDTsEAoFAIBAIBLJhQcMOgUAgEAgEAoFsWNCwQyAQCAQCgUAgGxY07BAIBAKBQCAQyIYFDTsEAoFAIBAIBLJhQcMOgUAgEAgEAoFsWNCwQyAQCAQCgUAgGxY07BAIBAKBQCAQyIYFDTsEAoFAIBAIBLJh2UzD/j//8z/kfQgEAjm6oD6uJ/D9vqSGeCLneHIUP1k7N9Ow//r1i7wPgUAgRxfUx/UEvt+X1BBP5BxPjuIna+dmGvafP3+S9yEQCOTogvq4nsD3+5Ia4omc48lR/GTt3EzD/v37d/I+BAKBHF1QH9cT+H5fUkM8kXM8OYqfrJ2rNezhd4++ffvmXUMgEAhEC+rjegLf70tqiCdyjidH8ZO1c5WGXTbr//Ef/+Hd++///m/vGgKBQCBaUB/XE/h+X1JDPJFzPDmKn6ydizfstlkPG/YvX7541xDIanJ7Kc7OzsT59Yl+DoEsLNXVR7OHQrm8JcZuXI5wNp2uz4l4XYpbYmztMjaetK+m9ddmci6xj7eSG0v6KRX3JWqatXPRht1t1sOG/Z///Kd3DYGsIydxfd406+fN5jy/FidyDASyrFRXH9VB7x/q+sA7F9cnZ1wFsuuz6XQtzmXjQdS628v6YsWRsfHUeRw3rG1Dd3nr3R8im8k5Yh9vSZb0k4rvSj2BtXOxhj1s1sOG/b/+67+8awhkFVEHWFOg1H/3eWBB6pPq6iN50N+Ky6ahqe1vrvZ7NtUZj7EyNp6phl2J+QForE83k3Mbb9iX9NOaDbu1c5GGnWrWw4b9w4cP3jUEsoaoTWk+Ibm9xNdiINuQ6upjpmGv7Wsxez2bZH074t8ijo1ntmFvZAq/bibnNt6wL+mnNRt2a+fsDXuqWZfijvvP//xP7xoCWV6ChmLjxQpyHKmuPhJ7p9YGcZ9n0zE/XZcyNp6lhl3n/ri/nd1Mzm38DFzST2s27NbOWRv2XLMuxR379u1b7xoCWVyi4lTnJ4KQ/Ul19VHtpaZB96TOH353eTaZr24csbaNjecSDftmco7cx41M8D39KWRJP+m4h75YpqZZO2dr2EvNuhR3fHgNgSwt6hPAoBBR9yCQpaW6+kh9MjfR93uXll2eTQdu2MfGc4mGfTM5R+3jDcmSflrzE3Zr5ywNO6dZl+K+8/LlS+8aAllUzAEW/wQtBf/zKWRdqa4+Jg76YrOzQdnn2XTcr8SMjWcph6fI8c3k3MYb9iX9tGbDbu2cvGHnNutS3Pf+/d//3buGQJaUdJE97sEG2Y5UVx9TB/0Enz4uLfs8m/Q/X1vj/1MwVsbGM9+QG7+O/FvZzeTcxhv2Jf20ZsNu7Zy0Ye/TrEtx3/23f/s37xoCWU7yRfao/5oCZDtSXX3c0Sfsuz2bVIyO92HE2Hhmc3iiH0g3k3Mbb9iX9NOaDbu1c7KGvW+zLsV9/+nTp941BLKYlL7PaQ42/M+nkLWkuvpIHfSVNoh7Ppt080l/WLHXX5w0Np6phl19sDPRObGZnNt4w76kn9Zs2K2dkzTsQ5p1Ke4c//qv/+pdQyBLiS60uaI0zV9zQiBDpbr6aJrzUGr8oXf/Z5P+2l8Ur53Wu7HxbH/ImdFfm8m5xD6WsoW9vKSf1mzYrZ2jG/ahzboUd55Hjx551xAIBALRgvq4nsD3+5Ia4omc48lR/GTtHNWwj2nWpbhzPXz40LuGQCAQiBbUx/UEvt+X1BBP5BxPjuIna+ek/9PpGLl//z55HwKBQI4uqI/rCXy/L6khnsg5nhzFT9bOzTTs9+7dI+9DIBDI0QX1cT2B7/clNcQTOceTo/jJ2rmZhv0f//gHeR8CgUCOLqiP6wl8vy+pIZ7IOZ4cxU/WzrM/f/6I//3f/xW/f/8Wv379Ej9//hQ/fvwQ379/F9++fRNfv34Vp9NJfPnyRXz+/Fl8+vRJfPz4UXz48EG8f/9evHv3Trx9+1a8fv1avHr1Sv1GphcvXojnz5+LZ8+eqX+O5smTJ+Lx48fqi/PyuzgPHjxQH/HLnxru3r0r7ty5I/7lX/4FAoFAIBAIBAKBBLKZT9ilMgAAAGLkhypgHeD7fVFDPJFzPI7iJ2snGnYAANg4OMDXA77fFzXEEznH4yh+snaiYQcAgI2DA3w94Pt9UUM8kXM8juInaycadgAA2Dg4wNcDvt8XNcQTOcfjKH6ydqJhBwCAjYMDfD3g+31RQzyRczyO4idrJxp2AADYODjA1wO+3xc1xBM5x+MofrJ2TtOw316Ks7NLcUs9YwoadgAAoEkeTH9vxMXZWVN/G7m4EX/N7Zp4c7Vt3blNgW/HX3FzMYVdU80zP1uPo2XpJk/55eqNueIxLOeOx1b9NCTmOaydoxr228tGKXtYoGEHAIBZoA8m08xNeDCswdabjmFNARr2rcKN51QMad622ohuja36aUjMc1g7hzfsp2txfnYurk/Nn/EJOwAAzAZ9ML0RV2dn4uKm7uN6603HbpunN1fNuX0hpkqfWuznxnMqhjRvu825iVndT4k9NCTmOayd+EoMAABsHPpgQsO+BLttntCwL8KQ5m23OTcxq/sJDXuHNfrvzUUzf/NnKYTT1Tj7XErgqOw8yuHBPRf3eWoMAADMSHQwhXWpEa9xd7/bbiQ8P2xdbOunqW1z10uvXtv1N1xXU01ByY7IriAmXjwyz6h5Zay9+JxdNT+++Xj6SXEmjZ414uXPDuNooeKpdG9s6Jvz8r1SLLJzJxiacxJvjBnnktWnsL9Lc7df4XLHKHF8wsgtLmP8xNEjZ2/0rBG7h9SzlI8tuXoQYO3cfsMujWktiT9RkmO8QmOc4Bqfmqc0t3a2u/nq+T4hAGA/0AdT4hN2cxB5B4C5F9ZOvwZq5qyXeu5uTHugbbimJhu8gh1qTHsd+kv6xr6fexbOY9dupI0F7WcvL4hzsRnUzBN/OrjXOFrS8WykZ85zY1GaJ2RMzpXiPsbW/NzGdmcxWud8bvVhqJ+4OV7yZTOomSfxCXvWn+G1XN/Vx8faWcUn7C7qXja4iaQJ3inf0w4NlpeDyAABAMBcUAeTrVHeoULUP4uqb84hFV5b5quXlL703Fsi9j3PDso/4Tua3LPSvIbiuUTkBfnOfuNoSTZ5ge6ce5xYqDGBQ8n3HIbmXMxU/RBFODeRO0Te5HOrH8P8NFQP7h4y6wULUDqk9nyItXMHDbtxYmO8J8578TuMe+anqWheJcOSCwAAhkA1GU3FIop+4jCSMBoJyWz1kvqEqoGae0tEvmfaEV63n/QR50fuGTlvuHikU/lcbCaK19txHC3UXqJ059zjxII
aQ83tMjTnOHGn1ubdK80dN7RqDvuhACe3ejLIT2w9Bu6hBrVeoEToz9yeD7F2Vt6wG4dSSRUmjTeGcS8ReAAAWBqqyWgqVtyw5+qWOlyCT9iJgbPVS85hukGGNk8pu9T95v3k324Ez8h5w8U9nXjnYjNRs07QLOw4jhZqL1G6c+6VY0GPoeZ2GZZzM/ZDdp7s3Oa60bMTJ8c5udWTQX5i6cHzZTNxvIca1HrBApSPJep+6KsAa2fdDTvp+NiplKPK9/r9lQUAAMwF1WTQNUrfC+umJKx56poxTjJNvaTHUHNvidj3PDuyduWahuAZa163cWCei947LfuNo4XaS5TunHvFWKjLZkwQaPI9h0E5N2c/xJlbjYmb1w5ObvVj2N5k6DFqD5n1/JdJH7eQ63VYOyv/hF07PkpG6l4uGQ3hPf1XFkEwZIBSTgcAgBmgmoymGJEHj/2rVq90qoPFv6fqXVBfJXPWS/VO+OmxrNcbrqnJBq9gh+cf6Qff+Z2vcs/UJTGvXKt9JzwHeediqknYaxwtyXgGunPutXaXfB84mZrbZVjOzdkPceY2Y0Jx5p26pxq6N8t6jNtDamxwU92z8xf2fIi1s/7vsBuHtcnRjA/fi95p4N6zh18rwXMAAJgb6mBqKpY6VMhPitQB4NQt4jAI66Rl3nppPqUyz6Xu6p0N11Xa92U7Qv+oazNein9eF56F8zQDPF+H/mOcixJ3XffRHuNoSTZ5ge6ce9anuVjYMS7U3C5Dc44Td2pt1r3i3Fo/31SjszNPObf4DPZTQ1GPEXsoOc5Zw33PfZfC2jlNwz6BUA07AACA1MEElmBrvqeaAcBnynjOFYsa97tugLtPti2p+1NwlLpo7UTDDgAAG+coB9MW2Zrv0bCPY8p4omHvsJ9Y++6Iv1oyJUepi9ZONOwAALBxjnIwbZGt+R4N+zimjCcadp/oayaNyK+kzMVR6qK1Ew07AABsnKMcTFsEvt8XNcQTOcfjKH6ydqJhBwCAjYMDfD3g+31RQzyRczyO4idr59mfP3/Uxe/fv8WvX7/Ez58/xY8fP8T379/Ft2/fxNevX8XpdBJfvnwRnz9/Fp8+fRIfP34UHz58EO/fvxfv3r0Tb9++Fa9fvxavXr0SL1++FC9evBDPnz8Xz549E0+fPhVPnjwRjx8/Fo8ePRIPHz4UDx48EPfv3xf37t0Td+/eFXfu3FENu9QDAoFAIBAIBAKBdIJP2AEAYOPIYg3WAb7fFzXEEznH4yh+snaiYQcAgI2DA3w94Pt9UUM8kXM8juInaycadgAA2Dg4wNcDvt8XNcQTOcfjKH6ydqJhBwCAjYMDfD3g+31RQzyRczyO4idrJxp2AADYODjA1wO+3xc1xBM5x+MofrJ2omEHAICNs+TBpH4ZzMWNmO/XnSyzxlTA9/uihibvKI3oWI7iJ2vniIb9VlwGv9Hq8pYaxxM07PWxZOEvraWeL/7b//6Km4vlfACOy3IHk/5V4vNupSXWmA74fl/U0OQdpREdy1H8ZO0c3LDfXp6L65Nz7/ZyVNOebdjfXOkfChJVRjVr8rkVooGKfmVupmK1Y0uNmNXr7KopgyG6MM75a3lTUL8eWMnEjSUadjTsYBlSBxOnroX1MVeT9Hx+PZu6dlJrKDZa59f0vaLgFwVnTENtvp+DwU3eBHHg+iir4wRrlMaoeIexMeuu0dOkGBzLpWDkTLS3iLHWzgm/EnMS1+fNYpe3xLOy0A27aYgyhmhj3QKkG2U32XRyXoguz8wYyol/b8RFM/bqqnmnUEy8pI/mWrthJ4ryxJCbeiZKa6nnmU0BQM1QBxOnrkX1MXvomnrrvD997YzXaO/JeRNzr1nn1/I9xy+8MZb6fD8H/Zu8aeLQx0e0jtOswc5dNzYqZvSaa9I/lkvRI2cYe8DaOel32G8vm8UnbNi7xDKGRUbT9/V7tsDoZAyLpD/GoueTY9XzgiPtHDetnuaBgl53CWjbpifa1DNSWks931gxAWAq4oOJUdfMIRuOSe4lM77bRjPUzmgNO9926/w6vrfz5fzCG9NSoe/noG+Tx/FxeUwfH9E6TrMGTw8/T/U7a8SqRN9YLkU5VunYU1g7J/+E/fz6RDwrS/477Gmj4wIYjCWKlMJ82uHedx2o/lxI0G68WdMbT28Mu24rzjtkANV4GXhzLUnZZOAkgm1w9dhYF42xyz5vRc9NHT7qnjs2UJK3bjBPM55ay6U8L51D4bx2nnZ98yx7P9DL06GRMAfaOaxQOmVtMWRyCeyL6GDi1DWixkl0XsX1Qd13c2iG2pl71jzdZJ1fxfceab90lMfU6Ps5GN7kjYhDDx9J8jqOWIOpRxdzsxaRs1tgeCyXIpUz+n7UHyawdk7WsJ+uz5ugBt9r7yFDG/YmtfRPf6qptX92kkslYtDwSsLEDa45xcQtPvHGixt2b7zC2GXXITaT3DgXFxd+YNVa6Q0UrxOjNmSzVufTUN/Y5/qdbt6wkCtdXT0T9uTXjdfR9vhrhZTnpXOIssGfR5O977yvdXXzTa7bXcvxU/iomEtgV0QHE6euEbmlIN+Nc4y1BnGtcpPMQ2IND3qPavS7Wh/7Zyf/J9e1YxXfe+T8YimNqdP3czC8yRsRB66PDHkdR6zB1MOeazfqLHLPmW0xPJZLkcoZsx+vrpTf1XmvhIhNg7VzkoZdN+tz/isxpY1ii4gWb1ifBHVe5BSTsGnSjZa9Dgukvo7j5uoX2imvm/nkGEeXUNcQrZf2RSe+D6g57CbVwwh9A1/64yniuHHXDQ+W0lrleekcCudV10SByt4P1kgfihTDfRQMkYOiOIN9MKhptLnl7RtzL3yXmo+1hhzm52uydhbzk96jHTrvbU3zhk2tq8Mqvvco+UVSGFOp7+dgeJM3Ig5MH1nyOo5Yo0+smmslK8SIy/BYLkUqZ8x+IutD3GdYO0c27OZ/NB3xybqVoQ27bU7to7ZZ7TKvueYkse8kTjHRaznvmTl1wxY0nuZZuwk86fTz1pXvqD/Luew6el7CFS2RXgRh8ZSoe63Nsc/1Ju7m9cdLbMIFEs6RWzcoHJZ4LZ/ivFa37Bh6Hkn2fkF3n+l8FM2hhMh1UD3DmkaJ32hJubqK652XX5aJaye5hsc26/wqvvdI+6UjP6ZW38/B8CZvRBzYOaPJ6zhiDaYebb6Y+3mb12N4LJcilTNBf2hR8Unnw4iG3fw77OfX4kQ+7yeDGnaTTFmjExvCTVyVnNKWhKQ+MdWFiyhCat4gICk9QtQ4rZecS7+v7VfvKr39NUMovUKUzYEyflE3Pnf8kD9ozHivAMdxK66b8JO/VkxxXkIXSTgvNY8ke7+ge8e8PgL7JTqYUjmQOpAd4kYp8SEAYw2Vl82YlHS1M7GGB71HrR5r1flVfO+R8ItHbky9vp+D4U3eiDj0zJm8jiPWYOrhnTdurDfG8FguRSpngv7QkoiPtXNww67+RZiJmnUpgxr2VCJ5RtOOKTW1cWGNoefQ651dXQXrJgIUoW2V4+SmsbaptZoL+98cJdskakMG83ibVPkwf/jE48
NYxHErrpvwkz8mpjzv8DGS7P2C7i0z+wjsl/hgonOgvPe7+mJJvzNsDfU82KucmmR1i/bZynV+Hd+7JPzikR5Ts+/nYHiTNyYO/XyU13HMGjw9/PPGXBf6gTUYHsulSMXK3A/zP/jByWLtHNiw60/Xx3xnPZRhX4nRyRcarZMrSD7XCabYhEnrwikmqc2mnS7X9NfQ44NgyLHBOnrti0ZHZ241rvkhoPFDtl40JPVyUD4JJvI3qfFtKI6u5HhnTu134l52XfteGD9/TAhn3tD/1LzUPJLs/cwazZ0md+31dD7i5hLYB9TBpHPHyYFiXaMOiVRt1fRfw+RmjzU6tlnn1/J9B2dcaszYNdb1/RwMb/LGxKGfj/I6jluDPYbK1UJfsTTDY7kUmZwxfWL3KO4PLNbOYQ376Vqcy4lJGfZ9dqph100JtYabNMbI5HONTtJOcoVEwikmWj86ge16oe8jm6g1zAbyn1k7yxsm6TdnPqVfoJy/SXWi+UNM8pkx0aa2eltpXg7XKa8rscVBi4xVKR5TzUvNI8neD/QK/e/l2mQ+itfJ+QfUTepgUnnh5IBf1/x8VxLmsMlHIrVb8mvERHu1sEaUx61so86v5XuOX4pjKvf9HPRt8iaJg4HrI0rHKdcojVHPw9jYsytXLBambyyXghuraFzCt9bOkf/T6XSS/4QdLI1OpLgop+4DAOZjroNpif1ce82A7/fFVps8lxp03AJH8ZO1Ew07INGFPvxkJv1XNgCA+ZjnYNL7ufRJ6DiWWGNe4Pt9UUOTd5RGdCxH8ZO1Ew07SGKbdldQ/AFYHhzg6wHf74sa4omc43EUP1k70bADAMDGwQG+HvD9vqghnsg5Hkfxk7UTDTsAAGwcHODrAd/vixriiZzjcRQ/WTvP/vz5oy5+//4tfv36JX7+/Cl+/Pghvn//Lr59+ya+fv0qTqeT+PLli/j8+bP49OmT+Pjxo/jw4YN4//69ePfunXj79q14/fq1ePXqlXj58qV48eKFeP78uXj27Jl4+vSpePLkiXj8+LF49OiRePjwoXjw4IG4f/++uHfvnrh79664c+eOatilHhAIBAKBQCAQCKQTfMIOAAAbRxZrsA7w/b6oIZ7IOR5H8ZO1Ew07AABsHBzg6wHf74sa4omc43EUP1k70bADAMDGwQG+HvD9vqghnsg5Hkfxk7UTDTsAAGwcHODrAd/vixriiZzjcRQ/WTvRsAMAwMbBAb4e8P2+qCGeyDkeR/GTtRMNOwAAbJwlD6Y3V2fi7OJGzPo7OBdYYyrg+31RQ5N3lEZ0LEfxk7VzeMN+uhbn3m/BPBfXJ2IcU9Cwg76UDh71/OqNuVqKv+LmIq8XAH1Z7mDSv85+3m2zxBrTAd/vixqavKM0omM5ip+snYMb9tP1pdegn67PRzXt2Yb9zZX+oSBRZVRjJp9bIZql6Nfsc+fiVLa/N+LCfUfJVVMaNwyh88VNXS0mGnZwFFIHU7aukXWpE2pr6PmI2pWqwUusYVirzq/ne91c596L7EiMs9Tm+zkY3OQVfKQojOH6KKvjVHsxo6uKURhjM35LfcLgWM5Oee+6tHmR6BusnRN+JeZWXDYLXt5Sz8pCN+ym+bFGpxKLKnKO4doZF6LLMzPGm29Yo2WLT6iaWrPnXKNQm8m1MQ2ts7Z/8c3YQ+8Qsqg4qOe5XQJAJVAHE6+uxej3iKbN1kCqLsp5GXNbpl5jzTq/ju/1tTsdVbdLNdCnPt/PQf8mr+wjzpg+OUPryNEjJs45ZszdONkfBphrLkX/WC6B9q/rKmrvtijfXoirqyZOib1h7ZyuYT/pr8hM2bB3CW4SLLKWvu8nqN4UYSMaJnGc1GU6/cyNNeE2vuan5M3sOzTsABSJDyZeXYtJNwnNy+pQdh91NS5VgymmXmPdOr+W7yPMGHfdUg30qND3c9C3ySv7iDOmX85QOnL0iIlzjjOPn1dmDm6eLUjfWK4GsXc1OgbyvorLMg37SVyfNwG9vCWe8ST/HXZuYkmCsalC6DWundP4pHWisM2j+m+Y/EaXVoigte9ZcdaNnjVC22J05m484zt3XsrcnG6KYB77uKR3aV71PGOLet68owuUmSOXK4ZwXjtPq495lr0f6OXp0EgYn3YOK5ROWVsMjFwC9REdTKy6FtMd1uaGQ+7ASO0VijnWiPdUMHa2Or8F3xvMulGNZO7xGn0/B8ObvLSPOhJjeuZMXkeOHppczuXm6WJuxmzgBy2K4bFcGGLvSnR8tG9z+9PaOaJhN016o4SUoZ+sWxnasDeppX/6U0lp/+wkl9oQRMJ6G8j89Ht1pe5Zm9KJ3pAIQAq1AeScgQ1uwDTGVidw8l1vHWrzp+z0iH/aTkIVEnMvPDDyuhnftmOkfeX4cGzuigpN7HNKl9gf4bzxPJrsfef9uGjKdbtrOZ5la9YWu04+l0CdRAcTq66F0PmuifPJJ/euy1xr6He1zfbPC9T5hvV9bzD1152irQuO0PPU6fs5GN7kcfZAYkzPnMnryNFDUhqXfm7PsBuVX+6Zsi2Gx3JhiL0bxn7mht2X20sZ2Jn+p9Ni4tkiosUb1qOY+M4ya6aSldhoumHq9HDn04U1nEuvG5mV0rmF8EfxHYkpmsUKWdjI2Q0cvltYk6W3JNbJFpXUq+p5YIP/Dm1nOG/K5uz9YI2yz10Stmb1HJpLoAYmaRrVO7lnPWsOxaxrmDptxBvG8od536sZZs1MTVvf9xJK9xh7BkX1plLfz8HwJo+zBxJjeuZMXkeOHg25nFOk59Fnm5FCzq3J8FguCb13wzN90YZdimraB34tZmjDbguUfdQ2zfZGj2JCF7lEwpv3U42Y30zFwVGYOdqN4Ymrs7E/HOPOl7LTwyQOaZCLHkcOi9Yp69b9IEPol9S7PG/o4xDK5/47dF6xYteQvW/fTxRkH6atOT3ZuQRqZHzTaHIssV+8XCKh94rPfGusVucb1ve9GcPcx9R8tfp+DoY3eT32QDimZ87kdeyhx8CYt/li9MuvtR7DY7kc5N5V+eD/oKr2VSJe1s5JG3b1TzueX4sT8awkgxp2k0zZIpAqot4GShST1LuKfPMbFkh1ndA/MYWB2niEP1IFwYOziRtyenmJxtTNoBNXipOopN68eUMfh1A+99+hdQ3npeaRZO/b94sx7mFrTk9WLoFaiQ6mVLxTdUDdT+VH5gf0FnqveMy1hrF1nTq/vu/VPi/W9g493m0G6vX9HAxv8hh7oODHvI868jqO3YuW9Dze2cKaax2Gx3IZUntX309LuE+snXV/wp5KJG9z0IVCf0qQazwbEpvJouegn3sJb68jRRNFzIXc6IQ/Crpawk9LaLRexY3M1c0lfIfSmzlv6OMQyuecuHDGSLL32/cLMe5ja1ZPRi6BaokPJk5d61C5QtyXpN7xKezrhtnWWLnOr+l7/S5he5LYxpp9PwfDm7yEjzxSY/rlTF7Hsh65nOtIz6Ped2Kl51s2ThyGx3J++u5dlQvh/jBYOwc27LfiMmjMZ/3FSYVNEBoZJmuUbKbYeJsnKkxm7qy3jV5RU
Mz9MOGJufSGDTaC1KV9N9bDJoI3H9n4UaR01vNan2i9gjEMH0W6yXfcScICTerNs1ndSyS4RD33J47eCf3frsOIXfZ+Zg0dA3vdw9aetii8XAK1Qh1MOk/ivRQ2Bcn7ilRtDSmMm3UNs0eCPNb2z1/n1/I9WYNd5NyBT+J36vb9HAxv8ji+TI9h50xDXseCHtmccyno6sXcjC3+ELAsw2M5L8W9S6DeSZzV1s6BDbv/L8RouRS35FieUA27NToWN2nMpk8+1+jN0gmVzNF6XG+bQuQJVeAS80XrhkEzG7B93sxDzefaWFKd8m3kk8gup9hYGLqFvg91I/XmzptIcEk4XhK/YwuRFumDcONQ80iy9wO9Qn97vubaWrQlXifnH1APqYNJ5YATb6qu6TGJg9bkHpHGCqpOaPHnm3+N9er8Or6n7LVi67Bfu7QEa1Xu+zno2+RxfMTzI89HEkrHfmvE8bFw5lFzhGeHPadWiFmKvrFcBs7ejVFxSZzX1s5JvxIzRvKfsAMAwHGZ62DSh3f6cJ+CJdaYE/h+X2yzyfOpQcctcBQ/WTvRsAMAwMaZ52DSnwSlPuWbhiXWmBf4fl/U0OQdpREdy1H8ZO1Eww4AABsHB/h6wPf7ooZ4Iud4HMVP1k407AAAsHFwgK8HfL8vaognco7HUfxk7UTDDgAAGwcH+HrA9/uihngi53gcxU/WzrM/f/6oi9+/f4tfv36Jnz9/ih8/fojv37+Lb9++ia9fv4rT6SS+fPkiPn/+LD59+iQ+fvwoPnz4IN6/fy/evXsn3r59K16/fi1evXolXr58KV68eCGeP38unj17Jp4+fSqePHkiHj9+LB49eiQePnwoHjx4IO7fvy/u3bsn7t69K+7cuaMadqkHBAKBQCAQCAQC6QSfsAMAwMaRxRqsA3y/L2qIJ3KOx1H8ZO1Eww4AABsHB/h6wPf7ooZ4Iud4HMVP1k407AAAsHFwgK8HfL8vaognco7HUfxk7UTDDgAAGwcH+HrA9/uihngi53gcxU/WTjTsAACwcXCArwd8vy9qiCdyjsdR/GTtRMMOAAAbZ8mD6c3VmTi7uBGz/g7OBdaYCvh+X9TQ5B2lER3LUfxk7ZysYb+9bIrA2aW4JZ5xBA17jr/i5oJbZPuMnR91OFy9MVdLsS0fADCW5Q4m/evs592yS6wxHfD9vqihyTtKIzqWo/jJ2jlNw366FudNEZitYX9z1cydbvxUU6jWN5Jq1LLz6ELmzlMsamq+C3Eze1eIhr0faNjBvkgdTH9vLrya5e21vzfiwn0WCLUt9XxXTTUMyNTOqP4aSW37IWtI1qrz6/mep2tWj4DafD8Hg5u8go8UmTF99klOx2geZ4IuF3J9SReHi/mbl1kZHMspKOQDZ19y9661c4KG/SSuz8/E5eVls+DUDbtpvDLG6OQlipxXUErz6OfubbspEv7TqIAt0bDXi/Jj1okAgBLUwaSLvVt/TO0r7Ldk02brpPc+swazfzgescZKdX4d3/N07adHuIak5DO77kpn7Az0b/LKPmL7kblPaB3NGpk5dD5oHVLNOGdMLfSP5RSUY83Zl332rrVzdMN+uj4XZ+fX4nQ7fcPeGWQcFBlC39fvdQWmPA+B+YQkm9Bo2IuoIsXxNwAgSXww6eIe1qew9sVkGjpT89xHnNrZpxEZtsa6dX4t30dEuvbUo0Lfz0HfJo9jG2fM2IY99DmFHXOTHKv1u7i5IXOnNvrGcgrKsebsy35719o5rmE/ya/CnIvrU/PnGRr2jj6bID02/yyAU0w4DbuZp/1prBF/eVqn0K7IzmBe93WqMKh7znhyveaeThgzhigupXlCyvP2sL8Z065vnmXvB/p7OjQSxradwwqlU9YWg/2rstwYAHoQHUxE86UwuRfdN+jcpWuWepbM1T41OM10a+Rqee5ZAKPOr+97Q6hrTz1q9P0cDG/yOLb18WOaWEc9b8lXOseahi+bGzIH6YaxNobHcgoSsebsy55719o5omHXX4U5vz7p65Ua9sZClXhuEtI/WUo4G85QKL6KNvnNdQg1h7nXbRRap3Bz+9fhZpNzdDZT73obk0gW9U5zr9Mj3tCceULK8/aw35tHk73vvB8flnLd7lqOn8JHbcE01619ji4A9CU6mFK1J7sn6b2miXPZJ/1uuy8coecZvoZ9d406v77vDaGuvfSo0/dzMLzJ49iWHsPfJ5SOJn5XVyq23Rx+/Lvzh9ZD6aDuMXNu4wyP5RQkYs3Zlz1riLVzcMOu/lUY+VUYe2+1hl1ii4iW5DDWhpOY+UpNVsrpitLGtQUvs7HChrW9zm+28N2YeM1uI3cMmSekPG8P+4lDIns/WKNfcRriIx2XyB3ZPAGgzCRNo3on9yyXo+W9btFNA7HfRq+xTp1f3/cSQtc+elTq+zkY3uRxbOPaL8OU2CcNsY6Uv8xazvnnfWAUxtzLCz0fGvYxJGLN2Zd99m6DtXNYw0415ys17Dbp7SN7TW8Y3mbSTWCuuBmyRVBvCHIp7z1aJ78RjK9bO4n1w7HtGmq8I86a6p2CDpx5Qsrz9rCfWCd7376fO0hbJvCRWSeaQwkjnwBIML5pNPnt7ecOL49J6H2agppvzBpr1vn1fZ/QtYcetfp+DoY3eRzbePZbUnGJdUw02CoHwtjYBt5/Rz1r10LDPp5ErDn7smcNsXYOaNj1V2HUhk1I+zWZHjKoYTfGlZK4o7yZehWSlNMlyeLdoN6zm4rWKdzIqY2t9ZXS/ZTtjzXze+/Ga6p3sjrw5glhz8uxn1gne9++n4uFYiIfFdcBYBjRwZTKtVRNStZESebDhZbyXndR+8KpSaPWWLnOr+37pK5sPer1/RwMb/I4e4AzpiPeJ5pYx0SDHeSA37C784fvJ+arjOGxnIL8nolSwN2XPWuItXPc/3TqyhqfsKeKRsoZhc2kk5t6L0HCuRq9Iai11DqFZpQc4zWUDoG93ljSF7Efijow5wkpzmuvB4yRZO+37xeK01Q+2kkRBNuDe4CHB7ZF5SlxX5J6x6e81zvMWGf/jlpj5Tq/pu/zuvL0qNn3czC8ycvbpuGMsZixzj6xxDomxgY9SBRrG6crOc7NgX2cVcNjOQWpWHP2Zb8aYu2su2E3RodJnC6Q6c2kHdWzkGQb9sScRAHU47p5bFFz7VL37LWcw50g0MMba33kjG/nD+95ig6bJ6Q8bw/7iXWy9zNrNHeaXLDXU/mIWqdBxscZA0BfqINJ56iTa+Zwjg7h1H1Fuib6JMbJuYPcjuveyDVWrvNr+Z6ja1mPun0/B8ObPI4vE2NY+6SD1DHqHeJzS8/pxsXoE4yz79J5WQ/DYzkF6Xzg1Ad2DWmwdm66YbcJHYubkCZpk88581BzWHEcGmI2UCiew6Mx1HzOpjLvK52dDa6CG14787o5E461idCObwarMc5L4bVkyDwhrHmbP7HsJ9bJ3vfWiPPAi9NUPmqI8i14DkBfUgeTyj8n16hir8dQzVWDyXtiCynKtdPfu/4zw+g1JOvV+XV8z9c1q0flvp+Dvk0ex0flMYx9
4pDSMVonCKx+TsfGH6rjQeVsTfSN5RRw8kHCrw/5MRJr53QN+0jJf8IOAADHZa6DiTrgp2aJNeYEvt8XazR5falBxy1wFD9ZO9GwAwDAxpnnYFriU7b6P8mD7/dFDU3eURrRsRzFT9ZONOwAALBxcICvB3y/L2qIJ3KOx1H8ZO1Eww4AABsHB/h6wPf7ooZ4Iud4HMVP1k407AAAsHFwgK8HfL8vaognco7HUfxk7Tz78+ePuvj9+7f49euX+Pnzp/jx44f4/v27+Pbtm/j69as4nU7iy5cv4vPnz+LTp0/i48eP4sOHD+L9+/fi3bt34u3bt+L169fi1atX4uXLl+LFixfi+fPn4tmzZ+Lp06fiyZMn4vHjx+LRo0fi4cOH4sGDB+L+/fvi3r174u7du+LOnTuqYZd6QCAQCAQCgUAgkE7wCTsAAGwcWazBOsD3+6KGeCLneBzFT9ZONOwAALBxcICvB3y/L2qIJ3KOx1H8ZO1Eww4AABsHB/h6wPf7ooZ4Iud4HMVP1k407AAAsHFwgK8HfL8vaognco7HUfxk7UTDDgAAGwcH+HrA9/uihngi53gcxU/WTjTsAACwcZY8mN5cnYmzixsx6+/gXGCNqYDv90UNTd5RGtGxHMVP1s4RDfutuDxrNn4g59cnYmxZ0LAz+XsjLqy/exbdsFBvqXAf6RBZ0tbSWur51RtztRR/xc3FceI9BcsdTPrX2c+bEkusMR3w/b6oock7SiM6lqP4ydo5smE/F9cn6ll/yTbsb650g5qoMqrpkM+tRI2ALlLumGTBKqzlYceeXTUrhOg1L26mbElMozOw2obN25KNY4miLsrXF2JSd05FT92W9HtpLfV88dMbDXtfUgfT35sLU4OMELFUMXbHZPyu5/PrGWcNBbN2UmsoCu+X7Zinzlfh+4Z2bM81FBv1/RwMbvKyevPs58aT1nG6NUpjVLzDGBv7p+1pxjE4lrPDiRV/z1g7hzfsp2txPnvDbg52axBhjS4kbgEyTmiTTc/hvmqLjz9dea0QL+mj8VqPaZN73JzhJiQ35UoUdVHFAg17X0prqeeMXAfrQh1Muv64eWdqnxNPFd9sfXSJPxDgrNGvdsZrcN4v26HncF/V74yv89v2vUH9zeuFuLpq3knu9/p8Pwf9mzyej2L7/TOhTzxjHXk+5qzBzl03j+zf7BO6rkn/WC4BJ1bcPaOxdm66Ye8Sy2yYyBL6vn7PLTABJvncxre8Voxd5ybaABK9CdCw8yjqgoZ9EKW11HNGroN1iQ8muhb4ta9nfTR1sRvOWcNeM2tntAbn/XXr/JZ9r9FrybHqeWq/V+j7Oejb5A3SO7K/TzyZOg5ag6eHf27od5Y6s/rQN5arQeyHiMwYa+fwhv32sgnwpbilng2Q/HfY0xslbkgYmyrrPP6m7JLcvOPpQW8Mu3b703oj4VK2iVL/lWPkvOavo1zx5ubO6+gYXmt78knV6mQlWMTqbudS4vlF481jbSXGtXCa4h6+zetn4unMoyVxQFG6Bbq4elC2ev6QEijO0zuYpxlPreVSnpfeD+G8dp52ffMsez/Qy9OhkTAP2zmsUDplbTGEe4kaszGig8nkV+CC1jZ7P/Zzur4pv7ljmWt0lGtntIZH+v0+drQY/cfW+a37Xue7rk05/9bo+zkY3uT10DuMX494Slg6hj7mrMHUo4u5sTl19q3M8FguTBgriswYa+fIhl0GspPLW2IcU4Y27E1q6Z/+VMNk/1xIrsQm0fA3pVso4zmJhp1a19xzx6nN0tyLdUj8ENBnXqfwetfUHAFyvLc2sflj3WOd9ZguRtqPvm4RSr9Mw97HB1n94viH+kZEulFzdu+r+YI4rOXX8rz0fqBs8OfRZO8772tdXR/KdbtrOX4KH3l7VmHsy+XeBogOptR+iPyifVCuj7Gv+GtY6FzpINbwyL3PtcOBqgktJV07Nu374FrlN5nLxBoe2/T9HAxv8rh6m3FuHHruJZaOoY85azD1sPX5RtXUQqxXZHgsFya7HwyZMdbOEf/TaSCmgZ/nX4kpbRRbRLRknWLHJg9ofjEJD3/dMNjrsECm5/Xfi687wjklPed17G6vOclEEq+t5gwm8telbIh1i0gVGkVPHzD084Zk126IntM2Woq2EvZw9e7r1/K8tG/DedU1kbPZ+8EaKX/RDPdRMEQOysd3AwxvGiXa7mx9pObrtYYkvQ8VRT8X3ufY0WLGJnO/tFbHln0f5nyyYa/U93MwvMnL6W2etT4Kal6vnOHoSPiYs0afvFJ2BGtsjOGxXJLSfpDkx1g7p2vYGzldn4uz82txIp6VZGjDrpvmLtHsdaoY6EQcU7g6wobdJr1uPHQAuiZEX5PTBpuIajw04ZySnvM6CeFuSl6zFBYlI87ilO7euokCFeoWkSo0inG+9deO46+eE01nC6Fbm4eEzrGt6/m1OK/VLTuGnkeSvV/Q3Wc6H0VzKEnl1jYY2jRy66PnJwtzjQ46VyzkGh7p97l2WNRa2ZjmdXXZrO/VGL8uqTUIH9fq+zkY3uTx9bY+as/VnnuppCPpY3bOlPVo88XcXytWJYbHcjnK+6E8xto5fcM+8Hvtgxp2k0xRs6mSMt4EHMcl1yLQm5IomGqNoLlObExFUHiVnvTAuGHvO69TtO31m6Ag0xi/eEU/9hWlu7duQt9Qt4hUoZGM9K2/trGpma8TP8YRGd3U3MEc5Hor+bU4L6GLJJyXmkeSvV/QvWNeH9VAdDClbHFz0Ywp18fED7ycNTzoXNFkfqhuSbzPtkOjYp6qFS05XX226nttZ1q6tev1/RwMb/L66G3G2rrPyRmHnI5JH/fIy5IeXt1MxHoLDI/lMnD2A2eMtXPShv32sll4yU/YU4lEJKV2Cifp+JuSatiblVRxPLu6Uv8NiyY1r7c57DW5vp7DL549501cqz/nkobc6LGvKN39dSkbwjEEicKmGedbb4yyM795IrK6NQS+i9dbz6/leYePkWTvF3RvmdlHNRAfTLQtXk1i1ke6jkkYa3ika2f6HZfE+yvX+Tp8r1HPg/1es+/nYHiT10dvM7ZQe1KxSemY9zFnDZ4eft206/Y8FxdgeCznh7MfuHvG2jm4Yb+99D9J15+uD/8fT4d9JUYnX1igtBO65NPJyCkkEv6mTBZCU+SkuBuD1IMoiEp/cv3cZmPOG27C9trYTdmjML52JrTJFt0LdCfXDYuDnCeIo4eyJ10wxvjW18/YGUof3eS1v6j3nFzPGd/6I7yX1du+18+vnHm1bwP9g3mpeSTZ+5k1dD7a6+l8FK/TIOOT8dEWoA4m7QPHFtNEdfXB+C2wTb9n8yRf78pruKTm4tbU1DiOHVI1ogYk4epUi+81ygfemnX7fg6GN3k5H3W+UJizx41Vn3hSOnJ8zFmDPYbKo9DOlRkey3nhxKrPnrF2jmjYZfBcGfdPPFINuzUoFjdpTEHp9dxKl7S8tXz0O/RzvSmIYJiN3ImzcQzqXTKK2hayYHPndTZhvCmNr1IZZDZ2u0YzLtSV0j1ex25+LdIe5cvgUPC
I7OvebRnoW18/rZs/xOib0k+t66+l5nR0ceeL/LGiX6eal5pHkr0f6BXuQS+2k/koXifnn62QOpiUfY4tns8Uhfpo/EqEqKW0RuTPVsw6hTWK7yvWq/Nb9n2Iss/N58p9Pwd9mzyO3tQYyufceMY68nws4axRGqOeh3XR1uBcwi5M31guAydW/HhKrJ2TfiVmjOQ/YQdgGXThjQ+Q1H0AlmCug2mJvK5978D3+2KbTZ5PDTpugaP4ydqJhh0AB33AhZ+OmJ+GN/TJAjgW8xxMOq9Ln9qOY4k15gW+3xc1NHlHaUTHchQ/WTvRsAMQYJt2V3DogTXBAb4e8P2+qCGeyDkeR/GTtRMNOwAAbBwc4OsB3++LGuKJnONxFD9ZO9GwAwDAxsEBvh7w/b6oIZ7IOR5H8ZO18+zPnz/q4vfv3+LXr1/i58+f4sePH+L79+/i27dv4uvXr+J0OokvX76Iz58/i0+fPomPHz+KDx8+iPfv34t3796Jt2/fitevX4tXr16Jly9fihcvXojnz5+LZ8+eiadPn4onT56Ix48fi0ePHomHDx+KBw8eiPv374t79+6Ju3fvijt37qiGXeoBgUAgEAgEAoFAOsEn7AAAsHFksQbrAN/vixriiZzjcRQ/WTvRsAMAwMbBAb4e8P2+qCGeyDkeR/GTtlOI/wef2R2IgR60PgAAAABJRU5ErkJggg=="
+    }
+   },
+   "cell_type": "markdown",
+   "id": "35d58b6f",
+   "metadata": {},
+   "source": [
+    "For the first three questions, you do not have to define any of your own functions. Use the `project` module by calling the specific function needed to solve a certain question.\n",
+    "\n",
+    "*Please Note*, indexing in python starts from **0**. Therefore, if a question asks you to use a certain value's **index**, do not be confused that with the **location** of the value in the dataset. In our dataset here,\n",
+    "\n",
+    "![table.PNG](attachment:table.PNG)\n",
+    "\n",
+    "the **index** for `1804 New England Hurricane` is 0, but the **location** is 1, and the **row number** is 2. Be sure to keep this concept in mind for *all* questions asking for the value at a particular **index**."
+   ]
+  },
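+  {
+   "cell_type": "markdown",
+   "id": "ab12cd34",
+   "metadata": {},
+   "source": [
+    "As a quick illustration of zero-based indexing, here is a minimal sketch that assumes `project.py` provides a `get_name(idx)` accessor (a hypothetical name; inspect your own `project.py` for the exact functions it exposes). It only shows that index `0` refers to the first hurricane in the table above.\n",
+    "\n",
+    "```python\n",
+    "import project  # the module provided with this project\n",
+    "\n",
+    "# index 0 is the first hurricane, even though its 'location' in the table is 1\n",
+    "first_name = project.get_name(0)  # hypothetical accessor; expected: '1804 New England Hurricane'\n",
+    "print(first_name)\n",
+    "```"
+   ]
+  },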
+  {
+   "cell_type": "markdown",
+   "id": "56da8b79",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "**Question 1:** How **many** hurricanes does the dataset have?"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "c3589ac0",
+   "metadata": {
+    "execution": {
+     "iopub.execute_input": "2023-10-04T01:14:42.399077Z",
+     "iopub.status.busy": "2023-10-04T01:14:42.399077Z",
+     "iopub.status.idle": "2023-10-04T01:14:42.406296Z",
+     "shell.execute_reply": "2023-10-04T01:14:42.406296Z"
+    },
+    "tags": []
+   },
+   "outputs": [],
+   "source": [
+    "# compute and store the answer in the variable 'num_hurricanes'\n",
+    "\n",
+    "# display the variable 'num_hurricanes' here"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "7232d907",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "outputs": [],
+   "source": [
+    "grader.check(\"q1\")"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "9a4bf2a5",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "**Question 2:** How many `deaths` were caused by the hurricane at index *315*?"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "968c7d78",
+   "metadata": {
+    "execution": {
+     "iopub.execute_input": "2023-10-04T01:14:42.434208Z",
+     "iopub.status.busy": "2023-10-04T01:14:42.434208Z",
+     "iopub.status.idle": "2023-10-04T01:14:42.438777Z",
+     "shell.execute_reply": "2023-10-04T01:14:42.438777Z"
+    },
+    "tags": []
+   },
+   "outputs": [],
+   "source": [
+    "# compute and store the answer in the variable 'deaths_315'\n",
+    "\n",
+    "# display the variable 'deaths_315' here"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "93dd1933",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "outputs": [],
+   "source": [
+    "grader.check(\"q2\")"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "5cc2dce4",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "**Question 3:** What is the `name` of the hurricane at the **end** of the dataset?\n",
+    "\n",
+    "**Hint**: Your code should work even if the number of hurricanes in the dataset were to change. You **must not hardcode** the index of the last hurricane."
+   ]
+  },
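+  {
+   "cell_type": "markdown",
+   "id": "bc23de45",
+   "metadata": {},
+   "source": [
+    "One way to avoid hardcoding the last index is to derive it from the dataset size. The sketch below assumes a hypothetical `project.count()` helper that returns the number of hurricanes and a hypothetical `project.get_name(idx)` accessor; adapt the names to whatever your `project.py` actually provides.\n",
+    "\n",
+    "```python\n",
+    "# minimal sketch, assuming `project` is already imported\n",
+    "last_index = project.count() - 1                 # zero-based indexing: last index is size - 1\n",
+    "name_last_index = project.get_name(last_index)   # hypothetical accessor for the name column\n",
+    "```"
+   ]
+  },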
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "4a307eb0",
+   "metadata": {
+    "execution": {
+     "iopub.execute_input": "2023-10-04T01:14:42.463930Z",
+     "iopub.status.busy": "2023-10-04T01:14:42.462947Z",
+     "iopub.status.idle": "2023-10-04T01:14:42.468342Z",
+     "shell.execute_reply": "2023-10-04T01:14:42.468342Z"
+    },
+    "tags": []
+   },
+   "outputs": [],
+   "source": [
+    "# compute and store the answer in the variable 'name_last_index'\n",
+    "\n",
+    "# display the variable 'name_last_index' here"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "3d45da9f",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "outputs": [],
+   "source": [
+    "grader.check(\"q3\")"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "8a769290",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "**Question 4:** How **many** hurricanes in the dataset did **not** cause any `deaths`?\n",
+    "\n",
+    "**Hint:** Loop through *all* hurricanes and count the hurricanes that has *0* `deaths`."
+   ]
+  },
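+  {
+   "cell_type": "markdown",
+   "id": "cd34ef56",
+   "metadata": {},
+   "source": [
+    "Here is a minimal sketch of the counting pattern from the hint, assuming hypothetical accessors `project.count()` and `project.get_deaths(idx)`; check your `project.py` for the actual names.\n",
+    "\n",
+    "```python\n",
+    "# count the hurricanes with zero deaths (sketch; accessor names are assumptions)\n",
+    "zero_death_hurrs = 0\n",
+    "for idx in range(project.count()):\n",
+    "    if project.get_deaths(idx) == 0:\n",
+    "        zero_death_hurrs += 1\n",
+    "```"
+   ]
+  },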
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "75341871",
+   "metadata": {
+    "execution": {
+     "iopub.execute_input": "2023-10-04T01:14:42.498877Z",
+     "iopub.status.busy": "2023-10-04T01:14:42.497879Z",
+     "iopub.status.idle": "2023-10-04T01:14:42.505568Z",
+     "shell.execute_reply": "2023-10-04T01:14:42.505568Z"
+    },
+    "tags": []
+   },
+   "outputs": [],
+   "source": [
+    "# compute and store the answer in the variable 'zero_death_hurrs'\n",
+    "\n",
+    "# display the variable 'zero_death_hurrs' here"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "bc763a6e",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "outputs": [],
+   "source": [
+    "grader.check(\"q4\")"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "0c3a4374",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "**Question 5:** What is the **fastest** speed (in `mph`) of a hurricane in the dataset?\n",
+    "\n",
+    "**Hint**: Look at Question 26 and Question 27 in Lab-P5 on finding the maximum/minimum. Here you will have to find the function value of the function `project.get_mph`."
+   ]
+  },
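+  {
+   "cell_type": "markdown",
+   "id": "de45fa67",
+   "metadata": {},
+   "source": [
+    "Here is a minimal sketch of the maximum-finding pattern: `project.get_mph` is named in the hint, while `project.count()` is an assumed helper for the dataset size.\n",
+    "\n",
+    "```python\n",
+    "# find the fastest speed in the dataset (sketch)\n",
+    "max_speed = 0\n",
+    "for idx in range(project.count()):    # assumed size helper\n",
+    "    speed = project.get_mph(idx)\n",
+    "    if speed > max_speed:\n",
+    "        max_speed = speed\n",
+    "```"
+   ]
+  },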
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "0b7a9fdc",
+   "metadata": {
+    "execution": {
+     "iopub.execute_input": "2023-10-04T01:14:42.544318Z",
+     "iopub.status.busy": "2023-10-04T01:14:42.544318Z",
+     "iopub.status.idle": "2023-10-04T01:14:42.550279Z",
+     "shell.execute_reply": "2023-10-04T01:14:42.550279Z"
+    },
+    "tags": []
+   },
+   "outputs": [],
+   "source": [
+    "# compute and store the answer in the variable 'max_speed'\n",
+    "\n",
+    "# display the variable 'max_speed' here"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "05259535",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "outputs": [],
+   "source": [
+    "grader.check(\"q5\")"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "ae0a2c35",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "### Function 1: `format_damage(damage)`\n",
+    "\n",
+    "You will notice if you look at the dataset that the damages caused by the hurricanes are not stored directly as numbers. Instead the damages have a suffix (`\"K\"`, `\"M\"`, or `\"B\"`) attached at the very end. You will have to convert these 'numbers' into integers before you can perform any mathematical operations on them. \n",
+    "\n",
+    "Since you will need to format damages for multiple hurricanes, you **must** create a general helper function that handles the `\"K\"`, `\"M\"`, and `\"B\"` suffixes. Remember that `\"K\"` stands for thousand, `\"M\"` stands for million, and `\"B\"` stands for billion. For example, your function should convert the string `\"13.5M\"` to `13500000`, `\"6.9K\"` to `6900` and so on. Note that for **some** hurricanes, the `damage` does **not** have **any** suffixes. For instance, the hurricane `Florence` at index `308` did damage `'0'`. Your function **must** also deal with such inputs, by directly typecasting them to ints. \n",
+    "\n",
+    "This function should take in the strings from the `damage` column as input, and return an **int**. Refer to Task 3.2 in Lab-P5 to understand how to slice and calculate damage.\n",
+    "\n",
+    "**Warning:** Your function `format_damage` must take in the damage as a **string**, and **not** an index. If you code your function to take in the index of a hurricane, and return the damage caused as an int, it will be useful only for this project. To make your function more useful, you must make it accept the damage itself (i.e., a string like `\"13.5M\"` or `\"6.9K\"`) as input."
+   ]
+  },
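+  {
+   "cell_type": "markdown",
+   "id": "ef56ab78",
+   "metadata": {},
+   "source": [
+    "Your graded implementation goes in the next cell; the sketch below only illustrates the slicing idea described above (check the last character, slice it off, multiply). Treat it as one possible approach under the stated suffix rules, not the required implementation.\n",
+    "\n",
+    "```python\n",
+    "def format_damage_sketch(damage):\n",
+    "    \"\"\"Convert strings like '13.5M', '6.9K', or plain '0' into an int.\"\"\"\n",
+    "    suffix = damage[-1]                                # last character\n",
+    "    if suffix == \"K\":\n",
+    "        return int(float(damage[:-1]) * 1000)          # thousand\n",
+    "    elif suffix == \"M\":\n",
+    "        return int(float(damage[:-1]) * 1000000)       # million\n",
+    "    elif suffix == \"B\":\n",
+    "        return int(float(damage[:-1]) * 1000000000)    # billion\n",
+    "    else:                                              # no suffix at all\n",
+    "        return int(damage)\n",
+    "\n",
+    "format_damage_sketch(\"13.5M\")   # 13500000\n",
+    "```"
+   ]
+  },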
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "20856f22",
+   "metadata": {
+    "execution": {
+     "iopub.execute_input": "2023-10-04T01:14:42.568739Z",
+     "iopub.status.busy": "2023-10-04T01:14:42.568739Z",
+     "iopub.status.idle": "2023-10-04T01:14:42.574107Z",
+     "shell.execute_reply": "2023-10-04T01:14:42.574107Z"
+    },
+    "tags": []
+   },
+   "outputs": [],
+   "source": [
+    "def format_damage(damage):\n",
+    "    pass # TODO: replace this with your code\n",
+    "    #TODO: use relevant intermediary variables to simplify your code\n",
+    "    #TODO: check the last character of the string `damage`\n",
+    "    #TODO: type cast the string (except for last character - use appropriate slicing) into a float\n",
+    "    #TODO: use the last character of string to determine what factor to multiply the float with\n",
+    "    #TODO: type cast the final computation to int\n",
+    "    "
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "b569c767",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "outputs": [],
+   "source": [
+    "grader.check(\"format_damage\")"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "47b77011",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "**Question 6:** What is the `damage` (in dollars) caused by the hurricane named *Igor*?\n",
+    "\n",
+    "There is **exactly one** hurricane in this dataset named *Igor*. You **must** exit the loop, and **stop** iterating as soon as you find the hurricane named *Igor*.\n",
+    "\n",
+    "You **must** use the `format_damage` function to answer this question. Your answer **must** be an `int`. "
+   ]
+  },
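+  {
+   "cell_type": "markdown",
+   "id": "fa67bc89",
+   "metadata": {},
+   "source": [
+    "Here is a minimal sketch of the search-and-stop pattern, assuming hypothetical accessors `project.get_name(idx)` and `project.get_damage(idx)` along with your `format_damage` function; the `break` stops the loop at the first match.\n",
+    "\n",
+    "```python\n",
+    "# search for 'Igor' and stop as soon as it is found (sketch)\n",
+    "damage_igor = None\n",
+    "for idx in range(project.count()):\n",
+    "    if project.get_name(idx) == \"Igor\":\n",
+    "        damage_igor = format_damage(project.get_damage(idx))\n",
+    "        break                         # exit the loop immediately\n",
+    "```"
+   ]
+  },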
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "dfecaff8",
+   "metadata": {
+    "execution": {
+     "iopub.execute_input": "2023-10-04T01:14:42.607286Z",
+     "iopub.status.busy": "2023-10-04T01:14:42.607286Z",
+     "iopub.status.idle": "2023-10-04T01:14:42.612633Z",
+     "shell.execute_reply": "2023-10-04T01:14:42.612633Z"
+    },
+    "tags": []
+   },
+   "outputs": [],
+   "source": [
+    "# compute and store the answer in the variable 'damage_igor'\n",
+    "\n",
+    "# display the variable 'damage_igor' here"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "feb95bd6",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "outputs": [],
+   "source": [
+    "grader.check(\"q6\")"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "a44ffa98",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "**Question 7:** What is the **total** `damage` (in dollars) caused by all hurricanes named *Karen* in the dataset? \n",
+    "\n",
+    "There are **multiple** hurricanes in this dataset named *Karen*. You must add up the damages caused by all of them. You **must** use the `format_damage` function to answer this question."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "69cb01e1",
+   "metadata": {
+    "execution": {
+     "iopub.execute_input": "2023-10-04T01:14:42.645024Z",
+     "iopub.status.busy": "2023-10-04T01:14:42.644024Z",
+     "iopub.status.idle": "2023-10-04T01:14:42.650341Z",
+     "shell.execute_reply": "2023-10-04T01:14:42.650341Z"
+    },
+    "tags": []
+   },
+   "outputs": [],
+   "source": [
+    "# compute and store the answer in the variable 'total_damage_karen'\n",
+    "\n",
+    "# display the variable 'total_damage_karen' here"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "3a5f8759",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "outputs": [],
+   "source": [
+    "grader.check(\"q7\")"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "28377e2a",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "**Question 8:** What is the **average** `damage` caused by hurricanes with names starting with the letter *G*?\n",
+    "\n",
+    "You should only consider hurricanes whose **first character** is `\"G\"`. Remember to search for `\"G\"` and not `\"g\"`. "
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "51d0de97",
+   "metadata": {
+    "execution": {
+     "iopub.execute_input": "2023-10-04T01:14:42.681689Z",
+     "iopub.status.busy": "2023-10-04T01:14:42.680689Z",
+     "iopub.status.idle": "2023-10-04T01:14:42.687097Z",
+     "shell.execute_reply": "2023-10-04T01:14:42.687097Z"
+    },
+    "tags": []
+   },
+   "outputs": [],
+   "source": [
+    "# compute and store the answer in the variable 'average_damage_starts_g'\n",
+    "# use relevant intermediary variables to simplify your code\n",
+    "\n",
+    "# display the variable 'average_damage_starts_g' here"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "5d72b561",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "outputs": [],
+   "source": [
+    "grader.check(\"q8\")"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "c1a6cd53",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "**Question 9:** What is the `name` of the **fastest** hurricane in the dataset?\n",
+    "\n",
+    "To break ties (if there are multiple hurricanes with the same speed), you **must** consider the **last** one you find. \n",
+    "\n",
+    "**Hint:** If you find the **index** of the fastest hurricane in Question 9 instead of just the **name** of the hurricane, you can solve Question 10 very easily using the appropriate function from the project module (i.e., without writing a new loop)."
+   ]
+  },
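+  {
+   "cell_type": "markdown",
+   "id": "ab78cd90",
+   "metadata": {},
+   "source": [
+    "Here is a minimal sketch of the 'remember the best index so far' pattern. Using `>=` (instead of `>`) keeps the **last** hurricane on ties, as the question requires; `project.count()` and `project.get_name(idx)` are assumed names.\n",
+    "\n",
+    "```python\n",
+    "# index of the fastest hurricane, breaking ties in favor of the last one (sketch)\n",
+    "fastest_idx = None\n",
+    "for idx in range(project.count()):\n",
+    "    if fastest_idx is None or project.get_mph(idx) >= project.get_mph(fastest_idx):\n",
+    "        fastest_idx = idx\n",
+    "fastest_hurricane = project.get_name(fastest_idx)\n",
+    "```"
+   ]
+  },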
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "cee13ee3",
+   "metadata": {
+    "execution": {
+     "iopub.execute_input": "2023-10-04T01:14:42.719357Z",
+     "iopub.status.busy": "2023-10-04T01:14:42.718357Z",
+     "iopub.status.idle": "2023-10-04T01:14:42.724751Z",
+     "shell.execute_reply": "2023-10-04T01:14:42.724751Z"
+    },
+    "tags": []
+   },
+   "outputs": [],
+   "source": [
+    "# compute and store the answer in the variable 'fastest_hurricane'\n",
+    "\n",
+    "# display the variable 'fastest_hurricane' here"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "ca24b99f",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "outputs": [],
+   "source": [
+    "grader.check(\"q9\")"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "e2e9d016",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "**Question 10:** What is the `damage` (in dollars) caused by the **fastest** hurricane (found in Question 9)?"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "d24531e5",
+   "metadata": {
+    "execution": {
+     "iopub.execute_input": "2023-10-04T01:14:42.762831Z",
+     "iopub.status.busy": "2023-10-04T01:14:42.761830Z",
+     "iopub.status.idle": "2023-10-04T01:14:42.766867Z",
+     "shell.execute_reply": "2023-10-04T01:14:42.766867Z"
+    },
+    "tags": []
+   },
+   "outputs": [],
+   "source": [
+    "# compute and store the answer in the variable 'fastest_hurricane_damage'\n",
+    "\n",
+    "# display the variable 'fastest_hurricane_damage' here"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "29b54b84",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "outputs": [],
+   "source": [
+    "grader.check(\"q10\")"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "4a17a9ea",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "### Functions 2, 3, 4: `get_year(date)`, `get_month(date)`, and `get_day(date)`\n",
+    "\n",
+    "Now would be a good time to copy the `get_year`, `get_month`, and `get_day` functions you created in Lab-P5 to your project notebook. You will need these functions for the upcoming questions."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "647677f0",
+   "metadata": {
+    "execution": {
+     "iopub.execute_input": "2023-10-04T01:14:42.804133Z",
+     "iopub.status.busy": "2023-10-04T01:14:42.804133Z",
+     "iopub.status.idle": "2023-10-04T01:14:42.808029Z",
+     "shell.execute_reply": "2023-10-04T01:14:42.808029Z"
+    },
+    "tags": []
+   },
+   "outputs": [],
+   "source": [
+    "# copy/paste the get_year function here from your lab-p5 practice notebook\n"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "5960c6fd",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "outputs": [],
+   "source": [
+    "grader.check(\"get_year\")"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "73580d08",
+   "metadata": {
+    "execution": {
+     "iopub.execute_input": "2023-10-04T01:14:42.818916Z",
+     "iopub.status.busy": "2023-10-04T01:14:42.817916Z",
+     "iopub.status.idle": "2023-10-04T01:14:42.821709Z",
+     "shell.execute_reply": "2023-10-04T01:14:42.821709Z"
+    },
+    "tags": []
+   },
+   "outputs": [],
+   "source": [
+    "# copy/paste the get_month function here from your lab-p5 practice notebook\n"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "81082f1e",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "outputs": [],
+   "source": [
+    "grader.check(\"get_month\")"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "a2295cf0",
+   "metadata": {
+    "execution": {
+     "iopub.execute_input": "2023-10-04T01:14:42.832729Z",
+     "iopub.status.busy": "2023-10-04T01:14:42.831728Z",
+     "iopub.status.idle": "2023-10-04T01:14:42.835599Z",
+     "shell.execute_reply": "2023-10-04T01:14:42.835599Z"
+    },
+    "tags": []
+   },
+   "outputs": [],
+   "source": [
+    "# copy/paste the get_day function here from your lab-p5 practice notebook\n"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "fe88c1a0",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "outputs": [],
+   "source": [
+    "grader.check(\"get_day\")"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "b939917b",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "**Question 11:** What is the `name` of the **earliest** hurricane which caused over *1 billion* dollars in `damages`?\n",
+    "\n",
+    "You **must** use the `year` of formation of the hurricane to identify the earliest hurricane. There are **no** other hurricanes in that year which caused over 1 billion dollars in damages, so you do not have to worry about breaking ties.\n",
+    "\n",
+    "You need to find the hurricane with the earliest year of formation among those hurricanes with more than 1 billion dollars in damages. You **must not** initialize your variable to be some hurricane which caused less than 1 billion dollars in damages, such as the hurricane at index `0` for example. If you do so, you will find that you are finding the hurricane with the earliest year of formation among the hurricanes with **either** more than 1 billion dollars in damages **or** have index `0`. This is **not** what you are supposed to do.\n",
+    "\n",
+    "**Hint:** Take a look at the [lecture notes for February 20](???) if you do not remember how to find the maximum/minimum with `None` initialization. You can use `continue` statement to skip to next index in a loop. "
+   ]
+  },
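+  {
+   "cell_type": "markdown",
+   "id": "bc89de01",
+   "metadata": {},
+   "source": [
+    "Here is a minimal sketch of the `None`-initialization pattern with `continue`, assuming hypothetical accessors `project.get_formed(idx)` (the formation date) and `project.get_damage(idx)`, plus your `get_year` and `format_damage` helpers.\n",
+    "\n",
+    "```python\n",
+    "# earliest hurricane with over 1 billion dollars in damages (sketch)\n",
+    "earliest_idx = None\n",
+    "for idx in range(project.count()):\n",
+    "    if format_damage(project.get_damage(idx)) <= 1000000000:\n",
+    "        continue                      # skip hurricanes at or below 1 billion\n",
+    "    year = get_year(project.get_formed(idx))\n",
+    "    if earliest_idx is None or year < get_year(project.get_formed(earliest_idx)):\n",
+    "        earliest_idx = idx\n",
+    "earliest_billion_dollar_hurr = project.get_name(earliest_idx)\n",
+    "```"
+   ]
+  },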
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "b6d49c68",
+   "metadata": {
+    "execution": {
+     "iopub.execute_input": "2023-10-04T01:14:42.846631Z",
+     "iopub.status.busy": "2023-10-04T01:14:42.845631Z",
+     "iopub.status.idle": "2023-10-04T01:14:42.852614Z",
+     "shell.execute_reply": "2023-10-04T01:14:42.852614Z"
+    },
+    "tags": []
+   },
+   "outputs": [],
+   "source": [
+    "# compute and store the answer in the variable 'earliest_billion_dollar_hurr'\n",
+    "\n",
+    "# display the variable 'earliest_billion_dollar_hurr' here"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "e1e7c74f",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "outputs": [],
+   "source": [
+    "grader.check(\"q11\")"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "26a2c264",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "**Question 12:** What is the `name` of the **most recent** hurricane which caused over *100 billion* dollars in `damages`?\n",
+    "\n",
+    "You **must** use the `year` of formation of the hurricane to identify the most recent hurricane. There are **no** other hurricanes in that year which caused over 100 billion dollars in damages, so you do not have to worry about breaking ties. You **must not** only use the indices of the hurricanes to determine the most recent hurricane (i.e., you may **not** take for granted that the hurricanes are sorted in increasing order of the date of formation)."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "352bc7cb",
+   "metadata": {
+    "execution": {
+     "iopub.execute_input": "2023-10-04T01:14:42.903886Z",
+     "iopub.status.busy": "2023-10-04T01:14:42.902886Z",
+     "iopub.status.idle": "2023-10-04T01:14:42.910062Z",
+     "shell.execute_reply": "2023-10-04T01:14:42.910062Z"
+    },
+    "tags": []
+   },
+   "outputs": [],
+   "source": [
+    "# compute and store the answer in the variable 'most_recent_100_billion_hurr'\n",
+    "\n",
+    "# display the variable 'most_recent_100_billion_hurr' here"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "42976c66",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "outputs": [],
+   "source": [
+    "grader.check(\"q12\")"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "e52967f3",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "### Function 5: `deadliest_in_range(year1, year2)`\n",
+    "\n",
+    "This function should take in two years, `year1` and `year2` as its inputs and return the **index** of the hurricane which formed **or** dissipated between `year1` and `year2` and caused the **most** `deaths`. In case of any ties, you must return the index of the **first** hurricane in the dataset with the most deaths.\n",
+    "\n",
+    "As in Question 11 and Question 12, you **must** initialize the variable you use to store the index of the deadliest hurricane as `None`, and update it for the first time only when you come across the first hurricane in the dataset within the year range."
+   ]
+  },
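+  {
+   "cell_type": "markdown",
+   "id": "cd90ef12",
+   "metadata": {},
+   "source": [
+    "Your graded implementation goes in the next cell; the sketch below only illustrates the structure described above: a `None`-initialized index, a year-range filter, and a strict `>` comparison so that ties keep the **first** hurricane. The accessor names `project.get_formed(idx)`, `project.get_dissipated(idx)`, and `project.get_deaths(idx)` are assumptions.\n",
+    "\n",
+    "```python\n",
+    "def deadliest_in_range_sketch(year1, year2):\n",
+    "    worst_idx = None\n",
+    "    for idx in range(project.count()):\n",
+    "        formed = get_year(project.get_formed(idx))\n",
+    "        dissipated = get_year(project.get_dissipated(idx))\n",
+    "        if not (year1 <= formed <= year2 or year1 <= dissipated <= year2):\n",
+    "            continue                   # hurricane not active in the range\n",
+    "        if worst_idx is None or project.get_deaths(idx) > project.get_deaths(worst_idx):\n",
+    "            worst_idx = idx            # strict > keeps the first hurricane on ties\n",
+    "    return worst_idx\n",
+    "```"
+   ]
+  },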
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "1f96e06f",
+   "metadata": {
+    "execution": {
+     "iopub.execute_input": "2023-10-04T01:14:42.962804Z",
+     "iopub.status.busy": "2023-10-04T01:14:42.961804Z",
+     "iopub.status.idle": "2023-10-04T01:14:42.968436Z",
+     "shell.execute_reply": "2023-10-04T01:14:42.968436Z"
+    },
+    "tags": []
+   },
+   "outputs": [],
+   "source": [
+    "def deadliest_in_range(year1, year2):\n",
+    "    \"\"\"\n",
+    "    deadliest_in_range(year1, year2) gets the index of the deadliest (most deaths) hurricane \n",
+    "    formed or dissipated within the given year range.\n",
+    "    year1 and year2 are inclusive bounds.\n",
+    "\n",
+    "    deadliest_in_range(year1, year2) returns the index of the worst hurricane within the year range.\n",
+    "    \"\"\"\n",
+    "    pass # TODO: replace with your code"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "f2bd8df5",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "outputs": [],
+   "source": [
+    "grader.check(\"deadliest_in_range\")"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "dd01ce85",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "**Question 13:** How much `damage` (in dollars) was done by the **deadliest** hurricane this century thus far (*2001 to 2023*, both inclusive)?\n",
+    "\n",
+    "Your answer **must** be an `int`. "
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "a9cb1f16",
+   "metadata": {
+    "execution": {
+     "iopub.execute_input": "2023-10-04T01:14:43.000280Z",
+     "iopub.status.busy": "2023-10-04T01:14:42.999280Z",
+     "iopub.status.idle": "2023-10-04T01:14:43.005400Z",
+     "shell.execute_reply": "2023-10-04T01:14:43.005400Z"
+    },
+    "tags": []
+   },
+   "outputs": [],
+   "source": [
+    "# compute and store the answer in the variable 'damage_by_deadliest_21st_century'\n",
+    "\n",
+    "# display the variable 'damage_by_deadliest_21st_century' here"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "8b1fb468",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "outputs": [],
+   "source": [
+    "grader.check(\"q13\")"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "d47f482e",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "**Question 14:** What was the speed (in `mph`) of the **deadliest** hurricane of the 20th century (*1901 to 2000*, both inclusive)?"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "087083c8",
+   "metadata": {
+    "execution": {
+     "iopub.execute_input": "2023-10-04T01:14:43.029723Z",
+     "iopub.status.busy": "2023-10-04T01:14:43.029723Z",
+     "iopub.status.idle": "2023-10-04T01:14:43.037046Z",
+     "shell.execute_reply": "2023-10-04T01:14:43.037046Z"
+    },
+    "tags": []
+   },
+   "outputs": [],
+   "source": [
+    "# compute and store the answer in the variable 'speed_of_deadliest_20th_century'\n",
+    "\n",
+    "# display the variable 'speed_of_deadliest_20th_century' here"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "fe726c3b",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "outputs": [],
+   "source": [
+    "grader.check(\"q14\")"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "68375727",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "**Question 15:** In this century (*2001 to 2022*, both inclusive) how many hurricanes formed on **average**, in the `month` of *October*?\n",
+    "\n",
+    "We will leave out the year *2023* since *October* isn't yet over. Your answer must be a  **float**. You may hardcode the month (i.e., **10**) and the range of years (i.e., **2001** and **2022**) for the average calculation."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "956329f4",
+   "metadata": {
+    "execution": {
+     "iopub.execute_input": "2023-10-04T01:14:43.065283Z",
+     "iopub.status.busy": "2023-10-04T01:14:43.065283Z",
+     "iopub.status.idle": "2023-10-04T01:14:43.072482Z",
+     "shell.execute_reply": "2023-10-04T01:14:43.072482Z"
+    },
+    "tags": []
+   },
+   "outputs": [],
+   "source": [
+    "# compute and store the answer in the variable 'avg_hurricanes_in_oct'\n",
+    "\n",
+    "# display the variable 'avg_hurricanes_in_oct' here"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "e164ced1",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "outputs": [],
+   "source": [
+    "grader.check(\"q15\")"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "2ad428f0",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "### Function 6: `get_year_total(year)`\n",
+    "\n",
+    "This function should take in `year` as its input and return the number of hurricanes that were **formed** in the given `year`."
+   ]
+  },
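+  {
+   "cell_type": "markdown",
+   "id": "de01fa23",
+   "metadata": {},
+   "source": [
+    "Here is a minimal sketch of this counting helper, again assuming `project.count()` and `project.get_formed(idx)` along with your `get_year` function; your graded definition goes in the next cell.\n",
+    "\n",
+    "```python\n",
+    "def get_year_total_sketch(year):\n",
+    "    total = 0\n",
+    "    for idx in range(project.count()):\n",
+    "        if get_year(project.get_formed(idx)) == year:   # formed in the given year\n",
+    "            total += 1\n",
+    "    return total\n",
+    "```"
+   ]
+  },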
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "4e941295",
+   "metadata": {
+    "execution": {
+     "iopub.execute_input": "2023-10-04T01:14:43.113157Z",
+     "iopub.status.busy": "2023-10-04T01:14:43.112157Z",
+     "iopub.status.idle": "2023-10-04T01:14:43.116175Z",
+     "shell.execute_reply": "2023-10-04T01:14:43.116175Z"
+    },
+    "tags": []
+   },
+   "outputs": [],
+   "source": [
+    "# define the function `get_year_total` here"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "6eeb1f36",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "outputs": [],
+   "source": [
+    "grader.check(\"get_year_total\")"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "39dd6c24",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "**Question 16:** How **many** hurricanes were formed in the `year` *2016*?\n",
+    "\n",
+    "You **must** answer this question by calling `get_year_total`."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "72093103",
+   "metadata": {
+    "execution": {
+     "iopub.execute_input": "2023-10-04T01:14:43.147988Z",
+     "iopub.status.busy": "2023-10-04T01:14:43.146987Z",
+     "iopub.status.idle": "2023-10-04T01:14:43.152623Z",
+     "shell.execute_reply": "2023-10-04T01:14:43.152623Z"
+    },
+    "tags": []
+   },
+   "outputs": [],
+   "source": [
+    "# compute and store the answer in the variable 'total_hurricanes_2016'\n",
+    "\n",
+    "# display the variable 'total_hurricanes_2016' here"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "67b6112c",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "outputs": [],
+   "source": [
+    "grader.check(\"q16\")"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "ac106398",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "**Question 17:** How **many** hurricanes were formed in the last `decade` (*2011 to 2020*, both inclusive)?\n",
+    "\n",
+    "You **must** answer this question by **looping** across the years in this decade, and calling the function `get_year_total`."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "e23d83a8",
+   "metadata": {
+    "execution": {
+     "iopub.execute_input": "2023-10-04T01:14:43.177328Z",
+     "iopub.status.busy": "2023-10-04T01:14:43.176328Z",
+     "iopub.status.idle": "2023-10-04T01:14:43.185658Z",
+     "shell.execute_reply": "2023-10-04T01:14:43.185658Z"
+    },
+    "tags": []
+   },
+   "outputs": [],
+   "source": [
+    "# compute and store the answer in the variable 'total_hurricanes_in_last_decade'\n",
+    "\n",
+    "# display the variable 'total_hurricanes_in_last_decade' here"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "0b8dfd7b",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "outputs": [],
+   "source": [
+    "grader.check(\"q17\")"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "e8b1e410",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "**Question 18:** Which `year` in the 20th century (*1901 to 2000*, both inclusive) suffered the **most** number of hurricanes?\n",
+    "\n",
+    "You **must** answer this question by calling the function `get_year_total`. You **must** break ties in favor of the most recent year."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "1ed008cf",
+   "metadata": {
+    "execution": {
+     "iopub.execute_input": "2023-10-04T01:14:43.219103Z",
+     "iopub.status.busy": "2023-10-04T01:14:43.218103Z",
+     "iopub.status.idle": "2023-10-04T01:14:43.295977Z",
+     "shell.execute_reply": "2023-10-04T01:14:43.294972Z"
+    },
+    "tags": []
+   },
+   "outputs": [],
+   "source": [
+    "# compute and store the answer in the variable 'year_with_most_hurricanes'\n",
+    "\n",
+    "# display the variable 'year_with_most_hurricanes' here"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "698ce095",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "outputs": [],
+   "source": [
+    "grader.check(\"q18\")"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "b0549041",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "**Question 19:** How **many** hurricanes lasted across at least 2 *different* `months`?\n",
+    "\n",
+    "**Hint:** You can determine if a hurricane lasted across two different months by comparing the month of formation and the month of dissipation of the hurricane. Note that there may be hurricanes which formed late in the year, and dissipated early in the next year. You may make the assumption that **no** hurricane formed in one month, lasted years, and then dissipated in the same month of a different year."
+   ]
+  },
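+  {
+   "cell_type": "markdown",
+   "id": "ef12ab34",
+   "metadata": {},
+   "source": [
+    "Here is a minimal sketch of the comparison described in the hint, assuming `project.get_formed(idx)` and `project.get_dissipated(idx)` return the two date strings and that your `get_month` function extracts the month from such a string.\n",
+    "\n",
+    "```python\n",
+    "# count hurricanes whose formation and dissipation months differ (sketch)\n",
+    "multiple_months_hurrs = 0\n",
+    "for idx in range(project.count()):\n",
+    "    if get_month(project.get_formed(idx)) != get_month(project.get_dissipated(idx)):\n",
+    "        multiple_months_hurrs += 1\n",
+    "```"
+   ]
+  },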
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "73ecdd09",
+   "metadata": {
+    "execution": {
+     "iopub.execute_input": "2023-10-04T01:14:43.336127Z",
+     "iopub.status.busy": "2023-10-04T01:14:43.336127Z",
+     "iopub.status.idle": "2023-10-04T01:14:43.342379Z",
+     "shell.execute_reply": "2023-10-04T01:14:43.342379Z"
+    },
+    "tags": []
+   },
+   "outputs": [],
+   "source": [
+    "# compute and store the answer in the variable 'multiple_months_hurrs'\n",
+    "\n",
+    "# display the variable 'multiple_months_hurrs' here"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "8b4eabc5",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "outputs": [],
+   "source": [
+    "grader.check(\"q19\")"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "87abb7b0",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "**Question 20:** What is the **average** `damage` caused by the **deadliest** hurricane of each year from *2001 - 2023*, both inclusive?\n",
+    "\n",
+    "You **must** use the `deadliest_in_range` function to identify the deadliest hurricane of each year, and you **must** use `format_damage` to convert the `damages` into an `int`. If two hurricanes in a year have the **same** deaths, you must break ties in favor of the hurricane that appears **first** in the dataset.\n",
+    "\n",
+    "**Hint:** For calculating average only consider the years that had a deadliest hurricane. If a particular year has no hurricanes in it (which would imply that it has no deadliest hurricane), you should skip that year from both the numerator and the denominator.\n",
+    "\n",
+    "Your answer **must** be a  **float**."
+   ]
+  },
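+  {
+   "cell_type": "markdown",
+   "id": "fa23bc45",
+   "metadata": {},
+   "source": [
+    "One possible way to structure this computation (a sketch, not the required approach) is to call `deadliest_in_range(year, year)` once per year and skip the years where it returns `None`; `project.get_damage(idx)` is again an assumed accessor name.\n",
+    "\n",
+    "```python\n",
+    "# average damage of each year's deadliest hurricane, 2001 to 2023 (sketch)\n",
+    "total_damage = 0\n",
+    "num_years = 0\n",
+    "for year in range(2001, 2024):         # 2001 to 2023, both inclusive\n",
+    "    deadliest_idx = deadliest_in_range(year, year)\n",
+    "    if deadliest_idx is None:          # no hurricanes that year: skip it entirely\n",
+    "        continue\n",
+    "    total_damage += format_damage(project.get_damage(deadliest_idx))\n",
+    "    num_years += 1\n",
+    "average_damage_deadliest = total_damage / num_years\n",
+    "```"
+   ]
+  },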
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "847659ae",
+   "metadata": {
+    "execution": {
+     "iopub.execute_input": "2023-10-04T01:14:43.381867Z",
+     "iopub.status.busy": "2023-10-04T01:14:43.381867Z",
+     "iopub.status.idle": "2023-10-04T01:14:43.404977Z",
+     "shell.execute_reply": "2023-10-04T01:14:43.404977Z"
+    },
+    "tags": []
+   },
+   "outputs": [],
+   "source": [
+    "# compute and store the answer in the variable 'average_damage_deadliest'\n",
+    "\n",
+    "# display the variable 'average_damage_deadliest' here"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "bbb1c542",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "outputs": [],
+   "source": [
+    "grader.check(\"q20\")"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "71144f03",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "outputs": [],
+   "source": [
+    "grader.check(\"general_deductions\")"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "aef41b3c",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "outputs": [],
+   "source": [
+    "grader.check(\"summary\")"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "383310d5",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    "## Submission\n",
+    "It is recommended that at this stage, you Restart and Run all Cells in your notebook.\n",
+    "That will automatically save your work and generate a zip file for you to submit.\n",
+    "\n",
+    "**SUBMISSION INSTRUCTIONS**:\n",
+    "1. **Upload** the zipfile to Gradescope.\n",
+    "2. If you completed the project with a **partner**, make sure to **add their name** by clicking \"Add Group Member\"\n",
+    "in Gradescope when uploading the zip file.\n",
+    "3. Check **Gradescope** results as soon as the auto-grader execution gets completed.\n",
+    "4. Your **final score** for this project is the score that you see on **Gradescope**.\n",
+    "5. You are **allowed** to resubmit on Gradescope as many times as you want to.\n",
+    "6. **Contact** a TA/PM if you lose any points on Gradescope for any **unclear reasons**."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "97c08385",
+   "metadata": {
+    "cell_type": "code",
+    "deletable": false,
+    "editable": false
+   },
+   "outputs": [],
+   "source": [
+    "# running this cell will create a new save checkpoint for your notebook\n",
+    "from IPython.display import display, Javascript\n",
+    "display(Javascript('IPython.notebook.save_checkpoint();'))"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "400ff620",
+   "metadata": {
+    "cell_type": "code",
+    "deletable": false,
+    "editable": false
+   },
+   "outputs": [],
+   "source": [
+    "!jupytext --to py p5.ipynb"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "188213f6",
+   "metadata": {
+    "cell_type": "code",
+    "deletable": false,
+    "editable": false
+   },
+   "outputs": [],
+   "source": [
+    "public_tests.check_file_size(\"p5.ipynb\")\n",
+    "grader.export(pdf=False, run_tests=False, files=[\"p5.py\"])"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "2ae54425",
+   "metadata": {
+    "deletable": false,
+    "editable": false
+   },
+   "source": [
+    " "
+   ]
+  }
+ ],
+ "metadata": {
+  "kernelspec": {
+   "display_name": "Python 3 (ipykernel)",
+   "language": "python",
+   "name": "python3"
+  },
+  "language_info": {
+   "codemirror_mode": {
+    "name": "ipython",
+    "version": 3
+   },
+   "file_extension": ".py",
+   "mimetype": "text/x-python",
+   "name": "python",
+   "nbconvert_exporter": "python",
+   "pygments_lexer": "ipython3",
+   "version": "3.11.4"
+  },
+  "otter": {
+   "OK_FORMAT": true,
+   "tests": {
+    "deadliest_in_range": {
+     "name": "deadliest_in_range",
+     "points": 0,
+     "suites": [
+      {
+       "cases": [
+        {
+         "code": ">>> \n>>> public_tests.rubric_check('deadliest_in_range: variable to store the index of the deadliest hurricane is not initialized as `None`')\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false,
+         "success_message": "Note that the Gradescope autograder will deduct points if your code fails the following rubric point - 'variable to store the index of the deadliest hurricane is not initialized as `None` (-1)'. The public tests cannot determine if your code satisfies these requirements. Verify your code manually."
+        },
+        {
+         "code": ">>> \n>>> public_tests.rubric_check('deadliest_in_range: function does not consider all hurricanes active between `year1` and `year2`')\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false,
+         "success_message": "Note that the Gradescope autograder will deduct points if your code fails the following rubric point - 'function does not consider all hurricanes active between `year1` and `year2` (-1)'. The public tests cannot determine if your code satisfies these requirements. Verify your code manually."
+        },
+        {
+         "code": ">>> \n>>> public_tests.rubric_check('deadliest_in_range: number of hurricanes in the dataset is hardcoded')\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false,
+         "success_message": "Note that the Gradescope autograder will deduct points if your code fails the following rubric point - 'number of hurricanes in the dataset is hardcoded (-1)'. The public tests cannot determine if your code satisfies these requirements. Verify your code manually."
+        },
+        {
+         "code": ">>> \n>>> public_tests.rubric_check('deadliest_in_range: function logic is incorrect')\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false,
+         "success_message": "Note that the Gradescope autograder will deduct points if your code fails the following rubric point - 'function logic is incorrect (-1)'. The public tests cannot determine if your code satisfies these requirements. Verify your code manually."
+        }
+       ],
+       "scored": true,
+       "setup": "",
+       "teardown": "",
+       "type": "doctest"
+      }
+     ]
+    },
+    "format_damage": {
+     "name": "format_damage",
+     "points": 0,
+     "suites": [
+      {
+       "cases": [
+        {
+         "code": ">>> \n>>> public_tests.rubric_check('format_damage: function output is incorrect when the damage has suffix `K`')\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false,
+         "success_message": "Note that the Gradescope autograder will deduct points if your code fails the following rubric point - 'function output is incorrect when the damage has suffix `K` (-1)'. The public tests cannot determine if your code satisfies these requirements. Verify your code manually."
+        },
+        {
+         "code": ">>> \n>>> public_tests.rubric_check('format_damage: function output is incorrect when the damage has suffix `M`')\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false,
+         "success_message": "Note that the Gradescope autograder will deduct points if your code fails the following rubric point - 'function output is incorrect when the damage has suffix `M` (-1)'. The public tests cannot determine if your code satisfies these requirements. Verify your code manually."
+        },
+        {
+         "code": ">>> \n>>> public_tests.rubric_check('format_damage: function output is incorrect when the damage has suffix `B`')\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false,
+         "success_message": "Note that the Gradescope autograder will deduct points if your code fails the following rubric point - 'function output is incorrect when the damage has suffix `B` (-1)'. The public tests cannot determine if your code satisfies these requirements. Verify your code manually."
+        },
+        {
+         "code": ">>> \n>>> public_tests.rubric_check('format_damage: function output is incorrect when the damage has no suffix')\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false,
+         "success_message": "Note that the Gradescope autograder will deduct points if your code fails the following rubric point - 'function output is incorrect when the damage has no suffix (-1)'. The public tests cannot determine if your code satisfies these requirements. Verify your code manually."
+        }
+       ],
+       "scored": true,
+       "setup": "",
+       "teardown": "",
+       "type": "doctest"
+      }
+     ]
+    },
+    "general_deductions": {
+     "name": "general_deductions",
+     "points": 0,
+     "suites": [
+      {
+       "cases": [
+        {
+         "code": ">>> \n>>> public_tests.rubric_check('general_deductions: Did not save the notebook file prior to running the cell containing \"export\". We cannot see your output if you do not save before generating the zip file. This deduction will become stricter for future projects.')\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false,
+         "success_message": "Note that the Gradescope autograder will deduct points if your code fails the following rubric point - 'Did not save the notebook file prior to running the cell containing \"export\". We cannot see your output if you do not save before generating the zip file. This deduction will become stricter for future projects. (-3)'. The public tests cannot determine if your code satisfies these requirements. Verify your code manually."
+        },
+        {
+         "code": ">>> \n>>> public_tests.rubric_check('general_deductions: Functions are defined more than once.')\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false,
+         "success_message": "Note that the Gradescope autograder will deduct points if your code fails the following rubric point - 'Functions are defined more than once. (-3)'. The public tests cannot determine if your code satisfies these requirements. Verify your code manually."
+        },
+        {
+         "code": ">>> \n>>> public_tests.rubric_check('general_deductions: Import statements are not all placed at the top of the notebook.')\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false,
+         "success_message": "Note that the Gradescope autograder will deduct points if your code fails the following rubric point - 'Import statements are not all placed at the top of the notebook. (-1)'. The public tests cannot determine if your code satisfies these requirements. Verify your code manually."
+        },
+        {
+         "code": ">>> \n>>> public_tests.rubric_check('general_deductions: Used concepts or modules not covered in class yet.')\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false,
+         "success_message": "Note that the Gradescope autograder will deduct points if your code fails the following rubric point - 'Used concepts or modules not covered in class yet. (-5)'. The public tests cannot determine if your code satisfies these requirements. Verify your code manually."
+        }
+       ],
+       "scored": true,
+       "setup": "",
+       "teardown": "",
+       "type": "doctest"
+      }
+     ]
+    },
+    "get_day": {
+     "name": "get_day",
+     "points": 0,
+     "suites": [
+      {
+       "cases": [
+        {
+         "code": ">>> \n>>> public_tests.rubric_check('get_day: function logic is incorrect')\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false,
+         "success_message": "Note that the Gradescope autograder will deduct points if your code fails the following rubric point - 'function logic is incorrect (-2)'. The public tests cannot determine if your code satisfies these requirements. Verify your code manually."
+        }
+       ],
+       "scored": true,
+       "setup": "",
+       "teardown": "",
+       "type": "doctest"
+      }
+     ]
+    },
+    "get_month": {
+     "name": "get_month",
+     "points": 0,
+     "suites": [
+      {
+       "cases": [
+        {
+         "code": ">>> \n>>> public_tests.rubric_check('get_month: function logic is incorrect')\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false,
+         "success_message": "Note that the Gradescope autograder will deduct points if your code fails the following rubric point - 'function logic is incorrect (-2)'. The public tests cannot determine if your code satisfies these requirements. Verify your code manually."
+        }
+       ],
+       "scored": true,
+       "setup": "",
+       "teardown": "",
+       "type": "doctest"
+      }
+     ]
+    },
+    "get_year": {
+     "name": "get_year",
+     "points": 0,
+     "suites": [
+      {
+       "cases": [
+        {
+         "code": ">>> \n>>> public_tests.rubric_check('get_year: function logic is incorrect')\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false,
+         "success_message": "Note that the Gradescope autograder will deduct points if your code fails the following rubric point - 'function logic is incorrect (-2)'. The public tests cannot determine if your code satisfies these requirements. Verify your code manually."
+        }
+       ],
+       "scored": true,
+       "setup": "",
+       "teardown": "",
+       "type": "doctest"
+      }
+     ]
+    },
+    "get_year_total": {
+     "name": "get_year_total",
+     "points": 0,
+     "suites": [
+      {
+       "cases": [
+        {
+         "code": ">>> \n>>> public_tests.rubric_check('get_year_total: function logic is incorrect')\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false,
+         "success_message": "Note that the Gradescope autograder will deduct points if your code fails the following rubric point - 'function logic is incorrect (-2)'. The public tests cannot determine if your code satisfies these requirements. Verify your code manually."
+        },
+        {
+         "code": ">>> \n>>> public_tests.rubric_check('get_year_total: function `get_year` is not used to answer', False)\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false,
+         "success_message": "Note that the Gradescope autograder will deduct points if your code fails the following rubric point - 'function `get_year` is not used to answer (-1)'. The public tests cannot determine if your code satisfies these requirements. Verify your code manually."
+        },
+        {
+         "code": ">>> \n>>> public_tests.rubric_check('get_year_total: number of hurricanes in the dataset is hardcoded')\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false,
+         "success_message": "Note that the Gradescope autograder will deduct points if your code fails the following rubric point - 'number of hurricanes in the dataset is hardcoded (-1)'. The public tests cannot determine if your code satisfies these requirements. Verify your code manually."
+        },
+        {
+         "code": ">>> public_tests.rubric_check('get_year_total: public tests')\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false
+        }
+       ],
+       "scored": true,
+       "setup": "",
+       "teardown": "",
+       "type": "doctest"
+      }
+     ]
+    },
+    "q1": {
+     "name": "q1",
+     "points": 0,
+     "suites": [
+      {
+       "cases": [
+        {
+         "code": ">>> public_tests.check('q1', num_hurricanes)\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false
+        },
+        {
+         "code": ">>> \n>>> public_tests.rubric_check('q1: required function is not used', False)\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false,
+         "success_message": "Note that the Gradescope autograder will deduct points if your code fails the following rubric point - 'required function is not used (-1)'. The public tests cannot determine if your code satisfies these requirements. Verify your code manually."
+        },
+        {
+         "code": ">>> public_tests.rubric_check('q1: public tests')\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false
+        }
+       ],
+       "scored": true,
+       "setup": "",
+       "teardown": "",
+       "type": "doctest"
+      }
+     ]
+    },
+    "q10": {
+     "name": "q10",
+     "points": 0,
+     "suites": [
+      {
+       "cases": [
+        {
+         "code": ">>> public_tests.check('q10', fastest_hurricane_damage)\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false
+        },
+        {
+         "code": ">>> \n>>> public_tests.rubric_check('q10: incorrect logic is used to answer')\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false,
+         "success_message": "Note that the Gradescope autograder will deduct points if your code fails the following rubric point - 'incorrect logic is used to answer (-2)'. The public tests cannot determine if your code satisfies these requirements. Verify your code manually."
+        },
+        {
+         "code": ">>> \n>>> public_tests.rubric_check('q10: tie breaking is not implemented correctly')\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false,
+         "success_message": "Note that the Gradescope autograder will deduct points if your code fails the following rubric point - 'tie breaking is not implemented correctly (-1)'. The public tests cannot determine if your code satisfies these requirements. Verify your code manually."
+        },
+        {
+         "code": ">>> \n>>> public_tests.rubric_check('q10: number of hurricanes in the dataset is hardcoded')\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false,
+         "success_message": "Note that the Gradescope autograder will deduct points if your code fails the following rubric point - 'number of hurricanes in the dataset is hardcoded (-1)'. The public tests cannot determine if your code satisfies these requirements. Verify your code manually."
+        },
+        {
+         "code": ">>> public_tests.rubric_check('q10: public tests')\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false
+        }
+       ],
+       "scored": true,
+       "setup": "",
+       "teardown": "",
+       "type": "doctest"
+      }
+     ]
+    },
+    "q11": {
+     "name": "q11",
+     "points": 0,
+     "suites": [
+      {
+       "cases": [
+        {
+         "code": ">>> public_tests.check('q11', earliest_billion_dollar_hurr)\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false
+        },
+        {
+         "code": ">>> \n>>> public_tests.rubric_check('q11: variable to store the index or name of the earliest hurricane is not initialized as `None`')\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false,
+         "success_message": "Note that the Gradescope autograder will deduct points if your code fails the following rubric point - 'variable to store the index or name of the earliest hurricane is not initialized as `None` (-1)'. The public tests cannot determine if your code satisfies these requirements. Verify your code manually."
+        },
+        {
+         "code": ">>> \n>>> public_tests.rubric_check('q11: `get_year` function is not used to determine the year of formation', False)\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false,
+         "success_message": "Note that the Gradescope autograder will deduct points if your code fails the following rubric point - '`get_year` function is not used to determine the year of formation (-1)'. The public tests cannot determine if your code satisfies these requirements. Verify your code manually."
+        },
+        {
+         "code": ">>> \n>>> public_tests.rubric_check('q11: used indices of the hurricanes to determine the earliest hurricane')\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false,
+         "success_message": "Note that the Gradescope autograder will deduct points if your code fails the following rubric point - 'used indices of the hurricanes to determine the earliest hurricane (-1)'. The public tests cannot determine if your code satisfies these requirements. Verify your code manually."
+        },
+        {
+         "code": ">>> \n>>> public_tests.rubric_check('q11: hurricanes with damages <= 1B are not ignored')\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false,
+         "success_message": "Note that the Gradescope autograder will deduct points if your code fails the following rubric point - 'hurricanes with damages <= 1B are not ignored (-1)'. The public tests cannot determine if your code satisfies these requirements. Verify your code manually."
+        },
+        {
+         "code": ">>> \n>>> public_tests.rubric_check('q11: number of hurricanes in the dataset is hardcoded')\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false,
+         "success_message": "Note that the Gradescope autograder will deduct points if your code fails the following rubric point - 'number of hurricanes in the dataset is hardcoded (-1)'. The public tests cannot determine if your code satisfies these requirements. Verify your code manually."
+        },
+        {
+         "code": ">>> public_tests.rubric_check('q11: public tests')\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false
+        }
+       ],
+       "scored": true,
+       "setup": "",
+       "teardown": "",
+       "type": "doctest"
+      }
+     ]
+    },
+    "q12": {
+     "name": "q12",
+     "points": 0,
+     "suites": [
+      {
+       "cases": [
+        {
+         "code": ">>> public_tests.check('q12', most_recent_100_billion_hurr)\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false
+        },
+        {
+         "code": ">>> \n>>> public_tests.rubric_check('q12: variable to store the index or name of the most recent hurricane is not initialized as `None`')\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false,
+         "success_message": "Note that the Gradescope autograder will deduct points if your code fails the following rubric point - 'variable to store the index or name of the most recent hurricane is not initialized as `None` (-1)'. The public tests cannot determine if your code satisfies these requirements. Verify your code manually."
+        },
+        {
+         "code": ">>> \n>>> public_tests.rubric_check('q12: `get_year` function is not used to determine the year of formation', False)\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false,
+         "success_message": "Note that the Gradescope autograder will deduct points if your code fails the following rubric point - '`get_year` function is not used to determine the year of formation (-1)'. The public tests cannot determine if your code satisfies these requirements. Verify your code manually."
+        },
+        {
+         "code": ">>> \n>>> public_tests.rubric_check('q12: used indices of the hurricanes to determine the most recent hurricane')\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false,
+         "success_message": "Note that the Gradescope autograder will deduct points if your code fails the following rubric point - 'used indices of the hurricanes to determine the most recent hurricane (-1)'. The public tests cannot determine if your code satisfies these requirements. Verify your code manually."
+        },
+        {
+         "code": ">>> \n>>> public_tests.rubric_check('q12: hurricanes with damages <= 100B are not ignored')\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false,
+         "success_message": "Note that the Gradescope autograder will deduct points if your code fails the following rubric point - 'hurricanes with damages <= 100B are not ignored (-1)'. The public tests cannot determine if your code satisfies these requirements. Verify your code manually."
+        },
+        {
+         "code": ">>> \n>>> public_tests.rubric_check('q12: number of hurricanes in the dataset is hardcoded')\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false,
+         "success_message": "Note that the Gradescope autograder will deduct points if your code fails the following rubric point - 'number of hurricanes in the dataset is hardcoded (-1)'. The public tests cannot determine if your code satisfies these requirements. Verify your code manually."
+        },
+        {
+         "code": ">>> public_tests.rubric_check('q12: public tests')\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false
+        }
+       ],
+       "scored": true,
+       "setup": "",
+       "teardown": "",
+       "type": "doctest"
+      }
+     ]
+    },
+    "q13": {
+     "name": "q13",
+     "points": 0,
+     "suites": [
+      {
+       "cases": [
+        {
+         "code": ">>> public_tests.check('q13', damage_by_deadliest_21st_century)\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false
+        },
+        {
+         "code": ">>> \n>>> public_tests.rubric_check('q13: functions `deadliest_in_range` and `format_damage` are not used to answer', False)\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false,
+         "success_message": "Note that the Gradescope autograder will deduct points if your code fails the following rubric point - 'functions `deadliest_in_range` and `format_damage` are not used to answer (-2)'. The public tests cannot determine if your code satisfies these requirements. Verify your code manually."
+        },
+        {
+         "code": ">>> \n>>> public_tests.rubric_check('q13: incorrect logic is used to answer')\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false,
+         "success_message": "Note that the Gradescope autograder will deduct points if your code fails the following rubric point - 'incorrect logic is used to answer (-2)'. The public tests cannot determine if your code satisfies these requirements. Verify your code manually."
+        }
+       ],
+       "scored": true,
+       "setup": "",
+       "teardown": "",
+       "type": "doctest"
+      }
+     ]
+    },
+    "q14": {
+     "name": "q14",
+     "points": 0,
+     "suites": [
+      {
+       "cases": [
+        {
+         "code": ">>> public_tests.check('q14', speed_of_deadliest_20th_century)\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false
+        },
+        {
+         "code": ">>> \n>>> public_tests.rubric_check('q14: function `deadliest_in_range` is not used to answer', False)\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false,
+         "success_message": "Note that the Gradescope autograder will deduct points if your code fails the following rubric point - 'function `deadliest_in_range` is not used to answer (-2)'. The public tests cannot determine if your code satisfies these requirements. Verify your code manually."
+        },
+        {
+         "code": ">>> \n>>> public_tests.rubric_check('q14: incorrect logic is used to answer')\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false,
+         "success_message": "Note that the Gradescope autograder will deduct points if your code fails the following rubric point - 'incorrect logic is used to answer (-2)'. The public tests cannot determine if your code satisfies these requirements. Verify your code manually."
+        }
+       ],
+       "scored": true,
+       "setup": "",
+       "teardown": "",
+       "type": "doctest"
+      }
+     ]
+    },
+    "q15": {
+     "name": "q15",
+     "points": 0,
+     "suites": [
+      {
+       "cases": [
+        {
+         "code": ">>> public_tests.check('q15', avg_hurricanes_in_oct)\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false
+        },
+        {
+         "code": ">>> \n>>> public_tests.rubric_check('q15: functions `get_year` and `get_month` are not used to answer', False)\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false,
+         "success_message": "Note that the Gradescope autograder will deduct points if your code fails the following rubric point - 'functions `get_year` and `get_month` are not used to answer (-1)'. The public tests cannot determine if your code satisfies these requirements. Verify your code manually."
+        },
+        {
+         "code": ">>> \n>>> public_tests.rubric_check('q15: incorrect logic is used to answer')\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false,
+         "success_message": "Note that the Gradescope autograder will deduct points if your code fails the following rubric point - 'incorrect logic is used to answer (-1)'. The public tests cannot determine if your code satisfies these requirements. Verify your code manually."
+        },
+        {
+         "code": ">>> \n>>> public_tests.rubric_check('q15: number of hurricanes in the dataset is hardcoded')\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false,
+         "success_message": "Note that the Gradescope autograder will deduct points if your code fails the following rubric point - 'number of hurricanes in the dataset is hardcoded (-1)'. The public tests cannot determine if your code satisfies these requirements. Verify your code manually."
+        },
+        {
+         "code": ">>> public_tests.rubric_check('q15: public tests')\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false
+        }
+       ],
+       "scored": true,
+       "setup": "",
+       "teardown": "",
+       "type": "doctest"
+      }
+     ]
+    },
+    "q16": {
+     "name": "q16",
+     "points": 0,
+     "suites": [
+      {
+       "cases": [
+        {
+         "code": ">>> public_tests.check('q16', total_hurricanes_2016)\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false
+        },
+        {
+         "code": ">>> \n>>> public_tests.rubric_check('q16: function `get_year_total` is not used to answer', False)\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false,
+         "success_message": "Note that the Gradescope autograder will deduct points if your code fails the following rubric point - 'function `get_year_total` is not used to answer (-3)'. The public tests cannot determine if your code satisfies these requirements. Verify your code manually."
+        },
+        {
+         "code": ">>> public_tests.rubric_check('q16: public tests')\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false
+        }
+       ],
+       "scored": true,
+       "setup": "",
+       "teardown": "",
+       "type": "doctest"
+      }
+     ]
+    },
+    "q17": {
+     "name": "q17",
+     "points": 0,
+     "suites": [
+      {
+       "cases": [
+        {
+         "code": ">>> public_tests.check('q17', total_hurricanes_in_last_decade)\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false
+        },
+        {
+         "code": ">>> \n>>> public_tests.rubric_check('q17: function `get_year_total` is not used to answer', False)\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false,
+         "success_message": "Note that the Gradescope autograder will deduct points if your code fails the following rubric point - 'function `get_year_total` is not used to answer (-2)'. The public tests cannot determine if your code satisfies these requirements. Verify your code manually."
+        },
+        {
+         "code": ">>> \n>>> public_tests.rubric_check('q17: did not loop through the years in the last decade and hardcoded all ten years')\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false,
+         "success_message": "Note that the Gradescope autograder will deduct points if your code fails the following rubric point - 'did not loop through the years in the last decade and hardcoded all ten years (-2)'. The public tests cannot determine if your code satisfies these requirements. Verify your code manually."
+        },
+        {
+         "code": ">>> \n>>> public_tests.rubric_check('q17: incorrect logic is used to answer')\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false,
+         "success_message": "Note that the Gradescope autograder will deduct points if your code fails the following rubric point - 'incorrect logic is used to answer (-1)'. The public tests cannot determine if your code satisfies these requirements. Verify your code manually."
+        }
+       ],
+       "scored": true,
+       "setup": "",
+       "teardown": "",
+       "type": "doctest"
+      }
+     ]
+    },
+    "q18": {
+     "name": "q18",
+     "points": 0,
+     "suites": [
+      {
+       "cases": [
+        {
+         "code": ">>> public_tests.check('q18', year_with_most_hurricanes)\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false
+        },
+        {
+         "code": ">>> \n>>> public_tests.rubric_check('q18: `year_with_most_hurricanes` is not initialized as some year in the twentieth century, or as `None`')\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false,
+         "success_message": "Note that the Gradescope autograder will deduct points if your code fails the following rubric point - '`year_with_most_hurricanes` is not initialized as some year in the twentieth century, or as `None` (-2)'. The public tests cannot determine if your code satisfies these requirements. Verify your code manually."
+        },
+        {
+         "code": ">>> \n>>> public_tests.rubric_check('q18: function `get_year_total` is not used to determine the year with the most hurricanes', False)\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false,
+         "success_message": "Note that the Gradescope autograder will deduct points if your code fails the following rubric point - 'function `get_year_total` is not used to determine the year with the most hurricanes (-1)'. The public tests cannot determine if your code satisfies these requirements. Verify your code manually."
+        },
+        {
+         "code": ">>> \n>>> public_tests.rubric_check('q18: tie breaking is not implemented correctly')\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false,
+         "success_message": "Note that the Gradescope autograder will deduct points if your code fails the following rubric point - 'tie breaking is not implemented correctly (-1)'. The public tests cannot determine if your code satisfies these requirements. Verify your code manually."
+        },
+        {
+         "code": ">>> \n>>> public_tests.rubric_check('q18: incorrect logic is used to answer')\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false,
+         "success_message": "Note that the Gradescope autograder will deduct points if your code fails the following rubric point - 'incorrect logic is used to answer (-1)'. The public tests cannot determine if your code satisfies these requirements. Verify your code manually."
+        }
+       ],
+       "scored": true,
+       "setup": "",
+       "teardown": "",
+       "type": "doctest"
+      }
+     ]
+    },
+    "q19": {
+     "name": "q19",
+     "points": 0,
+     "suites": [
+      {
+       "cases": [
+        {
+         "code": ">>> public_tests.check('q19', multiple_months_hurrs)\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false
+        },
+        {
+         "code": ">>> \n>>> public_tests.rubric_check('q19: hurricanes that formed at the end of one year and dissipated at the end of the next are not considered')\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false,
+         "success_message": "Note that the Gradescope autograder will deduct points if your code fails the following rubric point - 'hurricanes that formed at the end of one year and dissipated at the end of the next are not considered (-1)'. The public tests cannot determine if your code satisfies these requirements. Verify your code manually."
+        },
+        {
+         "code": ">>> \n>>> public_tests.rubric_check('q19: incorrect logic is used to answer')\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false,
+         "success_message": "Note that the Gradescope autograder will deduct points if your code fails the following rubric point - 'incorrect logic is used to answer (-1)'. The public tests cannot determine if your code satisfies these requirements. Verify your code manually."
+        },
+        {
+         "code": ">>> \n>>> public_tests.rubric_check('q19: function `get_month` is not used to answer', False)\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false,
+         "success_message": "Note that the Gradescope autograder will deduct points if your code fails the following rubric point - 'function `get_month` is not used to answer (-1)'. The public tests cannot determine if your code satisfies these requirements. Verify your code manually."
+        },
+        {
+         "code": ">>> public_tests.rubric_check('q19: public tests')\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false
+        }
+       ],
+       "scored": true,
+       "setup": "",
+       "teardown": "",
+       "type": "doctest"
+      }
+     ]
+    },
+    "q2": {
+     "name": "q2",
+     "points": 0,
+     "suites": [
+      {
+       "cases": [
+        {
+         "code": ">>> public_tests.check('q2', deaths_315)\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false
+        },
+        {
+         "code": ">>> \n>>> public_tests.rubric_check('q2: required function is not used', False)\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false,
+         "success_message": "Note that the Gradescope autograder will deduct points if your code fails the following rubric point - 'required function is not used (-1)'. The public tests cannot determine if your code satisfies these requirements. Verify your code manually."
+        },
+        {
+         "code": ">>> public_tests.rubric_check('q2: public tests')\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false
+        }
+       ],
+       "scored": true,
+       "setup": "",
+       "teardown": "",
+       "type": "doctest"
+      }
+     ]
+    },
+    "q20": {
+     "name": "q20",
+     "points": 0,
+     "suites": [
+      {
+       "cases": [
+        {
+         "code": ">>> public_tests.check('q20', average_damage_deadliest)\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false
+        },
+        {
+         "code": ">>> \n>>> public_tests.rubric_check('q20: years with no deadliest hurricane are not ignored')\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false,
+         "success_message": "Note that the Gradescope autograder will deduct points if your code fails the following rubric point - 'years with no deadliest hurricane are not ignored (-1)'. The public tests cannot determine if your code satisfies these requirements. Verify your code manually."
+        },
+        {
+         "code": ">>> \n>>> public_tests.rubric_check('q20: all hurricanes formed between 2001 and 2023 are not considered')\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false,
+         "success_message": "Note that the Gradescope autograder will deduct points if your code fails the following rubric point - 'all hurricanes formed between 2001 and 2023 are not considered (-1)'. The public tests cannot determine if your code satisfies these requirements. Verify your code manually."
+        },
+        {
+         "code": ">>> \n>>> public_tests.rubric_check('q20: incorrect logic is used to answer')\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false,
+         "success_message": "Note that the Gradescope autograder will deduct points if your code fails the following rubric point - 'incorrect logic is used to answer (-1)'. The public tests cannot determine if your code satisfies these requirements. Verify your code manually."
+        },
+        {
+         "code": ">>> \n>>> public_tests.rubric_check('q20: functions `deadliest_in_range` and `format_damage` are not used to answer', False)\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false,
+         "success_message": "Note that the Gradescope autograder will deduct points if your code fails the following rubric point - 'functions `deadliest_in_range` and `format_damage` are not used to answer (-1)'. The public tests cannot determine if your code satisfies these requirements. Verify your code manually."
+        },
+        {
+         "code": ">>> public_tests.rubric_check('q20: public tests')\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false
+        }
+       ],
+       "scored": true,
+       "setup": "",
+       "teardown": "",
+       "type": "doctest"
+      }
+     ]
+    },
+    "q3": {
+     "name": "q3",
+     "points": 0,
+     "suites": [
+      {
+       "cases": [
+        {
+         "code": ">>> public_tests.check('q3', name_last_index)\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false
+        },
+        {
+         "code": ">>> \n>>> public_tests.rubric_check('q3: index of the last hurricane is hardcoded')\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false,
+         "success_message": "Note that the Gradescope autograder will deduct points if your code fails the following rubric point - 'index of the last hurricane is hardcoded (-3)'. The public tests cannot determine if your code satisfies these requirements. Verify your code manually."
+        },
+        {
+         "code": ">>> \n>>> public_tests.rubric_check('q3: required function is not used', False)\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false,
+         "success_message": "Note that the Gradescope autograder will deduct points if your code fails the following rubric point - 'required function is not used (-1)'. The public tests cannot determine if your code satisfies these requirements. Verify your code manually."
+        }
+       ],
+       "scored": true,
+       "setup": "",
+       "teardown": "",
+       "type": "doctest"
+      }
+     ]
+    },
+    "q4": {
+     "name": "q4",
+     "points": 0,
+     "suites": [
+      {
+       "cases": [
+        {
+         "code": ">>> public_tests.check('q4', zero_death_hurrs)\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false
+        },
+        {
+         "code": ">>> \n>>> public_tests.rubric_check('q4: incorrect logic is used to answer')\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false,
+         "success_message": "Note that the Gradescope autograder will deduct points if your code fails the following rubric point - 'incorrect logic is used to answer (-2)'. The public tests cannot determine if your code satisfies these requirements. Verify your code manually."
+        },
+        {
+         "code": ">>> \n>>> public_tests.rubric_check('q4: number of hurricanes in the dataset is hardcoded')\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false,
+         "success_message": "Note that the Gradescope autograder will deduct points if your code fails the following rubric point - 'number of hurricanes in the dataset is hardcoded (-1)'. The public tests cannot determine if your code satisfies these requirements. Verify your code manually."
+        },
+        {
+         "code": ">>> public_tests.rubric_check('q4: public tests')\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false
+        }
+       ],
+       "scored": true,
+       "setup": "",
+       "teardown": "",
+       "type": "doctest"
+      }
+     ]
+    },
+    "q5": {
+     "name": "q5",
+     "points": 0,
+     "suites": [
+      {
+       "cases": [
+        {
+         "code": ">>> public_tests.check('q5', max_speed)\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false
+        },
+        {
+         "code": ">>> public_tests.rubric_check('q5: public tests')\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false
+        }
+       ],
+       "scored": true,
+       "setup": "",
+       "teardown": "",
+       "type": "doctest"
+      }
+     ]
+    },
+    "q6": {
+     "name": "q6",
+     "points": 0,
+     "suites": [
+      {
+       "cases": [
+        {
+         "code": ">>> public_tests.check('q6', damage_igor)\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false
+        },
+        {
+         "code": ">>> \n>>> public_tests.rubric_check('q6: did not exit loop and instead iterated further after finding the hurricane')\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false,
+         "success_message": "Note that the Gradescope autograder will deduct points if your code fails the following rubric point - 'did not exit loop and instead iterated further after finding the hurricane (-2)'. The public tests cannot determine if your code satisfies these requirements. Verify your code manually."
+        },
+        {
+         "code": ">>> \n>>> public_tests.rubric_check('q6: number of hurricanes in the dataset is hardcoded')\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false,
+         "success_message": "Note that the Gradescope autograder will deduct points if your code fails the following rubric point - 'number of hurricanes in the dataset is hardcoded (-1)'. The public tests cannot determine if your code satisfies these requirements. Verify your code manually."
+        },
+        {
+         "code": ">>> \n>>> public_tests.rubric_check('q6: `format_damage` function is not used to convert the damages into an integer', False)\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false,
+         "success_message": "Note that the Gradescope autograder will deduct points if your code fails the following rubric point - '`format_damage` function is not used to convert the damages into an integer (-1)'. The public tests cannot determine if your code satisfies these requirements. Verify your code manually."
+        }
+       ],
+       "scored": true,
+       "setup": "",
+       "teardown": "",
+       "type": "doctest"
+      }
+     ]
+    },
+    "q7": {
+     "name": "q7",
+     "points": 0,
+     "suites": [
+      {
+       "cases": [
+        {
+         "code": ">>> public_tests.check('q7', total_damage_karen)\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false
+        },
+        {
+         "code": ">>> \n>>> public_tests.rubric_check('q7: incorrect logic is used to answer')\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false,
+         "success_message": "Note that the Gradescope autograder will deduct points if your code fails the following rubric point - 'incorrect logic is used to answer (-2)'. The public tests cannot determine if your code satisfies these requirements. Verify your code manually."
+        },
+        {
+         "code": ">>> \n>>> public_tests.rubric_check('q7: number of hurricanes in the dataset is hardcoded')\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false,
+         "success_message": "Note that the Gradescope autograder will deduct points if your code fails the following rubric point - 'number of hurricanes in the dataset is hardcoded (-1)'. The public tests cannot determine if your code satisfies these requirements. Verify your code manually."
+        },
+        {
+         "code": ">>> \n>>> public_tests.rubric_check('q7: `format_damage` function is not used to convert the damages into an integer', False)\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false,
+         "success_message": "Note that the Gradescope autograder will deduct points if your code fails the following rubric point - '`format_damage` function is not used to convert the damages into an integer (-1)'. The public tests cannot determine if your code satisfies these requirements. Verify your code manually."
+        }
+       ],
+       "scored": true,
+       "setup": "",
+       "teardown": "",
+       "type": "doctest"
+      }
+     ]
+    },
+    "q8": {
+     "name": "q8",
+     "points": 0,
+     "suites": [
+      {
+       "cases": [
+        {
+         "code": ">>> public_tests.check('q8', average_damage_starts_g)\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false
+        },
+        {
+         "code": ">>> \n>>> public_tests.rubric_check('q8: incorrect logic is used to answer')\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false,
+         "success_message": "Note that the Gradescope autograder will deduct points if your code fails the following rubric point - 'incorrect logic is used to answer (-2)'. The public tests cannot determine if your code satisfies these requirements. Verify your code manually."
+        },
+        {
+         "code": ">>> \n>>> public_tests.rubric_check('q8: number of hurricanes in the dataset is hardcoded')\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false,
+         "success_message": "Note that the Gradescope autograder will deduct points if your code fails the following rubric point - 'number of hurricanes in the dataset is hardcoded (-1)'. The public tests cannot determine if your code satisfies these requirements. Verify your code manually."
+        },
+        {
+         "code": ">>> \n>>> public_tests.rubric_check('q8: `format_damage` function is not used to convert the damages into an integer', False)\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false,
+         "success_message": "Note that the Gradescope autograder will deduct points if your code fails the following rubric point - '`format_damage` function is not used to convert the damages into an integer (-1)'. The public tests cannot determine if your code satisfies these requirements. Verify your code manually."
+        }
+       ],
+       "scored": true,
+       "setup": "",
+       "teardown": "",
+       "type": "doctest"
+      }
+     ]
+    },
+    "q9": {
+     "name": "q9",
+     "points": 0,
+     "suites": [
+      {
+       "cases": [
+        {
+         "code": ">>> public_tests.check('q9', fastest_hurricane)\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false
+        },
+        {
+         "code": ">>> \n>>> public_tests.rubric_check('q9: incorrect logic is used to answer')\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false,
+         "success_message": "Note that the Gradescope autograder will deduct points if your code fails the following rubric point - 'incorrect logic is used to answer (-2)'. The public tests cannot determine if your code satisfies these requirements. Verify your code manually."
+        },
+        {
+         "code": ">>> \n>>> public_tests.rubric_check('q9: tie breaking is not implemented correctly')\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false,
+         "success_message": "Note that the Gradescope autograder will deduct points if your code fails the following rubric point - 'tie breaking is not implemented correctly (-1)'. The public tests cannot determine if your code satisfies these requirements. Verify your code manually."
+        },
+        {
+         "code": ">>> \n>>> public_tests.rubric_check('q9: number of hurricanes in the dataset is hardcoded')\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false,
+         "success_message": "Note that the Gradescope autograder will deduct points if your code fails the following rubric point - 'number of hurricanes in the dataset is hardcoded (-1)'. The public tests cannot determine if your code satisfies these requirements. Verify your code manually."
+        },
+        {
+         "code": ">>> public_tests.rubric_check('q9: public tests')\nAll test cases passed!\n",
+         "hidden": false,
+         "locked": false
+        }
+       ],
+       "scored": true,
+       "setup": "",
+       "teardown": "",
+       "type": "doctest"
+      }
+     ]
+    },
+    "summary": {
+     "name": "summary",
+     "points": 127,
+     "suites": [
+      {
+       "cases": [
+        {
+         "code": ">>> public_tests.get_summary()\nTotal Score: 100/100\n",
+         "hidden": false,
+         "locked": false
+        }
+       ],
+       "scored": true,
+       "setup": "",
+       "teardown": "",
+       "type": "doctest"
+      }
+     ]
+    }
+   }
+  },
+  "vscode": {
+   "interpreter": {
+    "hash": "aee8b7b246df8f9039afb4144a1f6fd8d2ca17a180786b69acc140d282b71a49"
+   }
+  }
+ },
+ "nbformat": 4,
+ "nbformat_minor": 5
+}
diff --git a/p5/project.py b/p5/project.py
new file mode 100644
index 0000000000000000000000000000000000000000..118e11b490d2d265e67c372322faf66c1345d358
--- /dev/null
+++ b/p5/project.py
@@ -0,0 +1,49 @@
+__hurricane__ = []
+
+
+def __init__():
+    """This function will read in the csv_file and store it in a list of dictionaries"""
+    import csv
+    __hurricane__.clear()
+    with open('hurricanes.csv', mode='r') as csv_file:
+        csv_reader = csv.DictReader(csv_file)
+        for row in csv_reader:
+            __hurricane__.append(row)
+
+
+def count():
+    """This function will return the number of records in the dataset"""
+    return len(__hurricane__)
+
+
+def get_name(idx):
+    """get_name(idx) returns the name of the hurricane in row idx"""
+    return __hurricane__[int(idx)]['name']
+
+
+def get_formed(idx):
+    """get_formed(idx) returns the date of formation of the hurricane in row idx"""
+    return __hurricane__[int(idx)]['formed']
+
+
+def get_dissipated(idx):
+    """get_dissipated(idx) returns the date of dissipation of the hurricane in row idx"""
+    return __hurricane__[int(idx)]['dissipated']
+
+
+def get_mph(idx):
+    """get_mph(idx) returns the mph of the hurricane in row idx"""
+    return int(__hurricane__[int(idx)]['mph'])
+
+
+def get_damage(idx):
+    """get_damage(idx) returns the damage in dollars of the hurricane in row idx"""
+    return __hurricane__[int(idx)]['damage']
+
+
+def get_deaths(idx):
+    """get_deaths(idx) returns the deaths of the hurricane in row idx"""
+    return int(__hurricane__[int(idx)]['deaths'])
+
+
+__init__()
diff --git a/p5/public_tests.py b/p5/public_tests.py
new file mode 100644
index 0000000000000000000000000000000000000000..35a00ce4fff657aa1ac7fe4274c2bb8d42ac0f83
--- /dev/null
+++ b/p5/public_tests.py
@@ -0,0 +1,808 @@
+#!/usr/bin/python
+# +
+import os, json, math, copy
+from collections import namedtuple
+from bs4 import BeautifulSoup
+
+HIDDEN_FILE = os.path.join("hidden", "hidden_tests.py")
+if os.path.exists(HIDDEN_FILE):
+    import hidden.hidden_tests as hidn
+# -
+
+MAX_FILE_SIZE = 750 # units - KB
+REL_TOL = 6e-04  # relative tolerance for floats
+ABS_TOL = 15e-03  # absolute tolerance for floats
+TOTAL_SCORE = 100 # total score for the project
+
+DF_FILE = 'expected_dfs.html'
+PLOT_FILE = 'expected_plots.json'
+
+PASS = "All test cases passed!"
+
+TEXT_FORMAT = "TEXT_FORMAT"  # question type when expected answer is a type, str, int, float, or bool
+TEXT_FORMAT_UNORDERED_LIST = "TEXT_FORMAT_UNORDERED_LIST"  # question type when the expected answer is a list or a set where the order does *not* matter
+TEXT_FORMAT_ORDERED_LIST = "TEXT_FORMAT_ORDERED_LIST"  # question type when the expected answer is a list or tuple where the order does matter
+TEXT_FORMAT_DICT = "TEXT_FORMAT_DICT"  # question type when the expected answer is a dictionary
+TEXT_FORMAT_SPECIAL_ORDERED_LIST = "TEXT_FORMAT_SPECIAL_ORDERED_LIST"  # question type when the expected answer is a list where order does matter, but with possible ties. Elements are ordered according to values in special_ordered_json (with ties allowed)
+TEXT_FORMAT_NAMEDTUPLE = "TEXT_FORMAT_NAMEDTUPLE"  # question type when expected answer is a namedtuple
+PNG_FORMAT_SCATTER = "PNG_FORMAT_SCATTER" # question type when the expected answer is a scatter plot
+HTML_FORMAT = "HTML_FORMAT" # question type when the expected answer is a DataFrame
+FILE_JSON_FORMAT = "FILE_JSON_FORMAT" # question type when the expected answer is a JSON file
+SLASHES = " SLASHES" # question SUFFIX when expected answer contains paths with slashes
+
+def get_expected_format():
+    """get_expected_format() returns a dict mapping each question to the format
+    of the expected answer."""
+    expected_format = {'q1': 'TEXT_FORMAT',
+                       'q2': 'TEXT_FORMAT',
+                       'q3': 'TEXT_FORMAT',
+                       'q4': 'TEXT_FORMAT',
+                       'q5': 'TEXT_FORMAT',
+                       'q6': 'TEXT_FORMAT',
+                       'q7': 'TEXT_FORMAT',
+                       'q8': 'TEXT_FORMAT',
+                       'q9': 'TEXT_FORMAT',
+                       'q10': 'TEXT_FORMAT',
+                       'q11': 'TEXT_FORMAT',
+                       'q12': 'TEXT_FORMAT',
+                       'q13': 'TEXT_FORMAT',
+                       'q14': 'TEXT_FORMAT',
+                       'q15': 'TEXT_FORMAT',
+                       'q16': 'TEXT_FORMAT',
+                       'q17': 'TEXT_FORMAT',
+                       'q18': 'TEXT_FORMAT',
+                       'q19': 'TEXT_FORMAT',
+                       'q20': 'TEXT_FORMAT'}
+    return expected_format
+
+
+def get_expected_json():
+    """get_expected_json() returns a dict mapping each question to the expected
+    answer (if the format permits it)."""
+    expected_json = {'q1': 554,
+                     'q2': 3,
+                     'q3': 'Nicole',
+                     'q4': 91,
+                     'q5': 190,
+                     'q6': 200000000,
+                     'q7': 4948000,
+                     'q8': 703819205.882353,
+                     'q9': 'Allen',
+                     'q10': 1570000000,
+                     'q11': '1900 Galveston hurricane',
+                     'q12': 'Ian',
+                     'q13': 91600000000,
+                     'q14': 155,
+                     'q15': 1.8181818181818181,
+                     'q16': 8,
+                     'q17': 97,
+                     'q18': 1995,
+                     'q19': 153,
+                     'q20': 20052745454.545456}
+    return expected_json
+
+
+def get_special_json():
+    """get_special_json() returns a dict mapping each question to the expected
+    answer stored in a special format as a list of tuples. Each tuple contains
+    the element expected in the list, and its corresponding value. Any two
+    elements with the same value can appear in any order in the actual list,
+    but if two elements have different values, then they must appear in the
+    same order as in the expected list of tuples."""
+    special_json = {}
+    return special_json
+
+
+def compare(expected, actual, q_format=TEXT_FORMAT):
+    """compare(expected, actual) is used to compare when the format of
+    the expected answer is known for certain."""
+    try:
+        if q_format == TEXT_FORMAT:
+            return simple_compare(expected, actual)
+        elif q_format == TEXT_FORMAT_UNORDERED_LIST:
+            return list_compare_unordered(expected, actual)
+        elif q_format == TEXT_FORMAT_ORDERED_LIST:
+            return list_compare_ordered(expected, actual)
+        elif q_format == TEXT_FORMAT_DICT:
+            return dict_compare(expected, actual)
+        elif q_format == TEXT_FORMAT_SPECIAL_ORDERED_LIST:
+            return list_compare_special(expected, actual)
+        elif q_format == TEXT_FORMAT_NAMEDTUPLE:
+            return namedtuple_compare(expected, actual)
+        elif q_format == PNG_FORMAT_SCATTER:
+            return compare_flip_dicts(expected, actual)
+        elif q_format == HTML_FORMAT:
+            return compare_cell_html(expected, actual)
+        elif q_format == FILE_JSON_FORMAT:
+            return compare_json(expected, actual)
+        else:
+            if expected != actual:
+                return "expected %s but found %s " % (repr(expected), repr(actual))
+    except:
+        if expected != actual:
+            return "expected %s" % (repr(expected))
+    return PASS
+
+
+def print_message(expected, actual, complete_msg=True):
+    """print_message(expected, actual) displays a simple error message."""
+    msg = "expected %s" % (repr(expected))
+    if complete_msg:
+        msg = msg + " but found %s" % (repr(actual))
+    return msg
+
+
+def simple_compare(expected, actual, complete_msg=True):
+    """simple_compare(expected, actual) is used to compare when the expected answer
+    is a type/Nones/str/int/float/bool. When the expected answer is a float,
+    the actual answer is allowed to be within the tolerance limit. Otherwise,
+    the values must match exactly, or a very simple error message is displayed."""
+    msg = PASS
+    if 'numpy' in repr(type(actual)):
+        actual = actual.item()
+    if isinstance(expected, type):
+        if expected != actual:
+            if isinstance(actual, type):
+                msg = "expected %s but found %s" % (expected.__name__, actual.__name__)
+            else:
+                msg = "expected %s but found %s" % (expected.__name__, repr(actual))
+    elif not isinstance(actual, type(expected)) and not (isinstance(expected, (float, int)) and isinstance(actual, (float, int))):
+        msg = "expected to find type %s but found type %s" % (type(expected).__name__, type(actual).__name__)
+    elif isinstance(expected, float):
+        if not math.isclose(actual, expected, rel_tol=REL_TOL, abs_tol=ABS_TOL):
+            msg = print_message(expected, actual, complete_msg)
+    elif isinstance(expected, (list, tuple)) or is_namedtuple(expected):
+        new_msg = print_message(expected, actual, complete_msg)
+        if len(expected) != len(actual):
+            return new_msg
+        for i in range(len(expected)):
+            val = simple_compare(expected[i], actual[i])
+            if val != PASS:
+                return new_msg
+    elif isinstance(expected, dict):
+        new_msg = print_message(expected, actual, complete_msg)
+        if len(expected) != len(actual):
+            return new_msg
+        val = simple_compare(list(expected.keys()), list(actual.keys()))
+        if val != PASS:
+            return new_msg
+        for key in expected:
+            val = simple_compare(expected[key], actual[key])
+            if val != PASS:
+                return new_msg
+    else:
+        if expected != actual:
+            msg = print_message(expected, actual, complete_msg)
+    return msg
+
+
+def intelligent_compare(expected, actual, obj=None):
+    """intelligent_compare(expected, actual) is used to compare when the
+    data type of the expected answer is not known for certain, and default
+    assumptions need to be made."""
+    if obj == None:
+        obj = type(expected).__name__
+    if is_namedtuple(expected):
+        msg = namedtuple_compare(expected, actual)
+    elif isinstance(expected, (list, tuple)):
+        msg = list_compare_ordered(expected, actual, obj)
+    elif isinstance(expected, set):
+        msg = list_compare_unordered(expected, actual, obj)
+    elif isinstance(expected, (dict)):
+        msg = dict_compare(expected, actual)
+    else:
+        msg = simple_compare(expected, actual)
+    msg = msg.replace("CompDict", "dict").replace("CompSet", "set").replace("NewNone", "None")
+    return msg
+
+
+def is_namedtuple(obj, init_check=True):
+    """is_namedtuple(obj) returns True if `obj` is a namedtuple object
+    defined in the test file."""
+    bases = type(obj).__bases__
+    if len(bases) != 1 or bases[0] != tuple:
+        return False
+    fields = getattr(type(obj), '_fields', None)
+    if not isinstance(fields, tuple):
+        return False
+    if init_check and not type(obj).__name__ in [nt.__name__ for nt in _expected_namedtuples]:
+        return False
+    return True
+
+
+def list_compare_ordered(expected, actual, obj=None):
+    """list_compare_ordered(expected, actual) is used to compare when the
+    expected answer is a list/tuple, where the order of the elements matters."""
+    msg = PASS
+    if not isinstance(actual, type(expected)):
+        msg = "expected to find type %s but found type %s" % (type(expected).__name__, type(actual).__name__)
+        return msg
+    if obj == None:
+        obj = type(expected).__name__
+    for i in range(len(expected)):
+        if i >= len(actual):
+            msg = "at index %d of the %s, expected missing %s" % (i, obj, repr(expected[i]))
+            break
+        val = intelligent_compare(expected[i], actual[i], "sub" + obj)
+        if val != PASS:
+            msg = "at index %d of the %s, " % (i, obj) + val
+            break
+    if len(actual) > len(expected) and msg == PASS:
+        msg = "at index %d of the %s, found unexpected %s" % (len(expected), obj, repr(actual[len(expected)]))
+    if len(expected) != len(actual):
+        msg = msg + " (found %d entries in %s, but expected %d)" % (len(actual), obj, len(expected))
+
+    if len(expected) > 0:
+        try:
+            if msg != PASS and list_compare_unordered(expected, actual, obj) == PASS:
+                msg = msg + " (%s may not be ordered as required)" % (obj)
+        except:
+            pass
+    return msg
+
+
+def list_compare_helper(larger, smaller):
+    """list_compare_helper(larger, smaller) is a helper function which takes in
+    two lists of possibly unequal sizes and finds the item that is not present
+    in the smaller list, if there is such an element."""
+    msg = PASS
+    j = 0
+    for i in range(len(larger)):
+        if i == len(smaller):
+            msg = "expected %s" % (repr(larger[i]))
+            break
+        found = False
+        while not found:
+            if j == len(smaller):
+                val = simple_compare(larger[i], smaller[j - 1], complete_msg=False)
+                break
+            val = simple_compare(larger[i], smaller[j], complete_msg=False)
+            j += 1
+            if val == PASS:
+                found = True
+                break
+        if not found:
+            msg = val
+            break
+    return msg
+
+class NewNone():
+    """alternate class in place of None, which allows for comparison with
+    all other data types."""
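+    # NewNone compares as less than (or equal to) every other value, so lists
+    # that contain None can still be sorted once make_sortable() has replaced
+    # the Nones.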
+    def __str__(self):
+        return 'None'
+    def __repr__(self):
+        return 'None'
+    def __lt__(self, other):
+        return True
+    def __le__(self, other):
+        return True
+    def __gt__(self, other):
+        return False
+    def __ge__(self, other):
+        return other == None
+    def __eq__(self, other):
+        return other == None
+    def __ne__(self, other):
+        return other != None
+
+class CompDict(dict):
+    """subclass of dict, which allows for comparison with other dicts."""
+    def __init__(self, vals):
+        super(self.__class__, self).__init__(vals)
+        if type(vals) == CompDict:
+            self.val = vals.val
+        elif isinstance(vals, dict):
+            self.val = self.get_equiv(vals)
+        else:
+            raise TypeError("'%s' object cannot be type casted to CompDict class" % type(vals).__name__)
+
+    def get_equiv(self, vals):
+        val = []
+        for key in sorted(list(vals.keys())):
+            val.append((key, vals[key]))
+        return val
+
+    def __str__(self):
+        return str(dict(self.val))
+    def __repr__(self):
+        return repr(dict(self.val))
+    def __lt__(self, other):
+        return self.val < CompDict(other).val
+    def __le__(self, other):
+        return self.val <= CompDict(other).val
+    def __gt__(self, other):
+        return self.val > CompDict(other).val
+    def __ge__(self, other):
+        return self.val >= CompDict(other).val
+    def __eq__(self, other):
+        return self.val == CompDict(other).val
+    def __ne__(self, other):
+        return self.val != CompDict(other).val
+
+class CompSet(set):
+    """subclass of set, which allows for comparison with other sets."""
+    def __init__(self, vals):
+        super(self.__class__, self).__init__(vals)
+        if type(vals) == CompSet:
+            self.val = vals.val
+        elif isinstance(vals, set):
+            self.val = self.get_equiv(vals)
+        else:
+            raise TypeError("'%s' object cannot be type casted to CompSet class" % type(vals).__name__)
+
+    def get_equiv(self, vals):
+        return sorted(list(vals))
+
+    def __str__(self):
+        return str(set(self.val))
+    def __repr__(self):
+        return repr(set(self.val))
+    def __getitem__(self, index):
+        return self.val[index]
+    def __lt__(self, other):
+        return self.val < CompSet(other).val
+    def __le__(self, other):
+        return self.val <= CompSet(other).val
+    def __gt__(self, other):
+        return self.val > CompSet(other).val
+    def __ge__(self, other):
+        return self.val >= CompSet(other).val
+    def __eq__(self, other):
+        return self.val == CompSet(other).val
+    def __ne__(self, other):
+        return self.val != CompSet(other).val
+
+def make_sortable(item):
+    """make_sortable(item) replaces all Nones in `item` with an alternate
+    class that allows for comparison with str/int/float/bool/list/set/tuple/dict.
+    It also replaces all dicts (and sets) with a subclass that allows for
+    comparison with other dicts (and sets)."""
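+    # For example, [3, None, 1] becomes [3, NewNone(), 1], which sorted() can
+    # then order without raising a TypeError.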
+    if item == None:
+        return NewNone()
+    elif isinstance(item, (type, str, int, float, bool)):
+        return item
+    elif isinstance(item, (list, set, tuple)):
+        new_item = []
+        for subitem in item:
+            new_item.append(make_sortable(subitem))
+        if is_namedtuple(item):
+            return type(item)(*new_item)
+        elif isinstance(item, set):
+            return CompSet(new_item)
+        else:
+            return type(item)(new_item)
+    elif isinstance(item, dict):
+        new_item = {}
+        for key in item:
+            new_item[key] = make_sortable(item[key])
+        return CompDict(new_item)
+    return item
+
+def list_compare_unordered(expected, actual, obj=None):
+    """list_compare_unordered(expected, actual) is used to compare when the
+    expected answer is a list/set where the order of the elements does not matter."""
+    msg = PASS
+    if not isinstance(actual, type(expected)):
+        msg = "expected to find type %s but found type %s" % (type(expected).__name__, type(actual).__name__)
+        return msg
+    if obj == None:
+        obj = type(expected).__name__
+
+    try:
+        sort_expected = sorted(make_sortable(expected))
+        sort_actual = sorted(make_sortable(actual))
+    except:
+        return "unexpected datatype found in %s; expected entries of type %s" % (obj, obj, type(expected[0]).__name__)
+
+    if len(actual) == 0 and len(expected) > 0:
+        msg = "in the %s, missing" % (obj) + sort_expected[0]
+    elif len(actual) > 0 and len(expected) > 0:
+        val = intelligent_compare(sort_expected[0], sort_actual[0])
+        if val.startswith("expected to find type"):
+            msg = "in the %s, " % (obj) + simple_compare(sort_expected[0], sort_actual[0])
+        else:
+            if len(expected) > len(actual):
+                msg = "in the %s, missing " % (obj) + list_compare_helper(sort_expected, sort_actual)
+            elif len(expected) < len(actual):
+                msg = "in the %s, found un" % (obj) + list_compare_helper(sort_actual, sort_expected)
+            if len(expected) != len(actual):
+                msg = msg + " (found %d entries in %s, but expected %d)" % (len(actual), obj, len(expected))
+                return msg
+            else:
+                val = list_compare_helper(sort_expected, sort_actual)
+                if val != PASS:
+                    msg = "in the %s, missing " % (obj) + val + ", but found un" + list_compare_helper(sort_actual,
+                                                                                               sort_expected)
+    return msg
+
+
+def namedtuple_compare(expected, actual):
+    """namedtuple_compare(expected, actual) is used to compare when the
+    expected answer is a namedtuple defined in the test file."""
+    msg = PASS
+    if not is_namedtuple(actual, False):
+        msg = "expected namedtuple but found %s" % (type(actual).__name__)
+        return msg
+    if type(expected).__name__ != type(actual).__name__:
+        return "expected namedtuple %s but found namedtuple %s" % (type(expected).__name__, type(actual).__name__)
+    expected_fields = expected._fields
+    actual_fields = actual._fields
+    msg = list_compare_ordered(list(expected_fields), list(actual_fields), "namedtuple attributes")
+    if msg != PASS:
+        return msg
+    for field in expected_fields:
+        val = intelligent_compare(getattr(expected, field), getattr(actual, field))
+        if val != PASS:
+            msg = "at attribute %s of namedtuple %s, " % (field, type(expected).__name__) + val
+            return msg
+    return msg
+
+
+def clean_slashes(item):
+    """clean_slashes()"""
+    if isinstance(item, str):
+        return item.replace("\\", "/").replace("/", os.path.sep)
+    elif item == None or isinstance(item, (type, int, float, bool)):
+        return item
+    elif isinstance(item, (list, tuple, set)) or is_namedtuple(item):
+        new_item = []
+        for subitem in item:
+            new_item.append(clean_slashes(subitem))
+        if is_namedtuple(item):
+            return type(item)(*new_item)
+        else:
+            return type(item)(new_item)
+    elif isinstance(item, dict):
+        new_item = {}
+        for key in item:
+            new_item[clean_slashes(key)] = clean_slashes(item[key])
+        return new_item
+
+
+def list_compare_special_initialize(special_expected):
+    """list_compare_special_initialize(special_expected) takes in the special
+    ordering stored as a sorted list of items, and returns a list of lists
+    where the ordering among the inner lists does not matter."""
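+    # For example, [("a", 1), ("b", 1), ("c", 2)] is grouped into
+    # [["a", "b"], ["c"]]: consecutive rows that share the same second entry
+    # end up in the same inner list.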
+    latest_val = None
+    clean_special = []
+    for row in special_expected:
+        if latest_val == None or row[1] != latest_val:
+            clean_special.append([])
+            latest_val = row[1]
+        clean_special[-1].append(row[0])
+    return clean_special
+
+
+def list_compare_special(special_expected, actual):
+    """list_compare_special(special_expected, actual) is used to compare when the
+    expected answer is a list with special ordering defined in `special_expected`."""
+    msg = PASS
+    expected_list = []
+    special_order = list_compare_special_initialize(special_expected)
+    for expected_item in special_order:
+        expected_list.extend(expected_item)
+    val = list_compare_unordered(expected_list, actual)
+    if val != PASS:
+        return val
+    i = 0
+    for expected_item in special_order:
+        j = len(expected_item)
+        actual_item = actual[i: i + j]
+        val = list_compare_unordered(expected_item, actual_item)
+        if val != PASS:
+            if j == 1:
+                msg = "at index %d " % (i) + val
+            else:
+                msg = "between indices %d and %d " % (i, i + j - 1) + val
+            msg = msg + " (list may not be ordered as required)"
+            break
+        i += j
+    return msg
+
+
+def dict_compare(expected, actual, obj=None):
+    """dict_compare(expected, actual) is used to compare when the expected answer
+    is a dict."""
+    msg = PASS
+    if not isinstance(actual, type(expected)):
+        msg = "expected to find type %s but found type %s" % (type(expected).__name__, type(actual).__name__)
+        return msg
+    if obj == None:
+        obj = type(expected).__name__
+
+    expected_keys = list(expected.keys())
+    actual_keys = list(actual.keys())
+    val = list_compare_unordered(expected_keys, actual_keys, obj)
+
+    if val != PASS:
+        msg = "bad keys in %s: " % (obj) + val
+    if msg == PASS:
+        for key in expected:
+            new_obj = None
+            if isinstance(expected[key], (list, tuple, set)):
+                new_obj = 'value'
+            elif isinstance(expected[key], dict):
+                new_obj = 'sub' + obj
+            val = intelligent_compare(expected[key], actual[key], new_obj)
+            if val != PASS:
+                msg = "incorrect value for key %s in %s: " % (repr(key), obj) + val
+    return msg
+
+
+def is_flippable(item):
+    """is_flippable(item) determines if the given dict of lists has lists of the
+    same length and is therefore flippable."""
+    item_lens = set(([str(len(item[key])) for key in item]))
+    if len(item_lens) == 1:
+        return PASS
+    else:
+        return "found lists of lengths %s" % (", ".join(list(item_lens)))
+
+def flip_dict_of_lists(item):
+    """flip_dict_of_lists(item) flips a dict of lists into a list of dicts if the
+    lists are of the same length."""
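+    # For example, {"a": [1, 2], "b": [3, 4]} is flipped into
+    # [{"a": 1, "b": 3}, {"a": 2, "b": 4}].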
+    new_item = []
+    length = len(list(item.values())[0])
+    for i in range(length):
+        new_dict = {}
+        for key in item:
+            new_dict[key] = item[key][i]
+        new_item.append(new_dict)
+    return new_item
+
+def compare_flip_dicts(expected, actual, obj="lists"):
+    """compare_flip_dicts(expected, actual) flips a dict of lists (or dicts) into
+    a list of dicts (or dict of dicts) and then compares the list ignoring order."""
+    msg = PASS
+    example_item = list(expected.values())[0]
+    if isinstance(example_item, (list, tuple)):
+        val = is_flippable(actual)
+        if val != PASS:
+            msg = "expected to find lists of length %d, but " % (len(example_item)) + val
+            return msg
+        msg = list_compare_unordered(flip_dict_of_lists(expected), flip_dict_of_lists(actual), "lists")
+    elif isinstance(example_item, dict):
+        expected_keys = list(example_item.keys())
+        for key in actual:
+            val = list_compare_unordered(expected_keys, list(actual[key].keys()), "dictionary %s" % key)
+            if val != PASS:
+                return val
+        for cat_key in expected_keys:
+            expected_category = {}
+            actual_category = {}
+            for key in expected:
+                expected_category[key] = expected[key][cat_key]
+                actual_category[key] = actual[key][cat_key]
+            val = list_compare_unordered(flip_dict_of_lists(expected_category), flip_dict_of_lists(actual_category), "category " + repr(cat_key))
+            if val != PASS:
+                return val
+    return msg
+
+
+def get_expected_tables():
+    """get_expected_tables() reads the html file with the expected DataFrames
+    and returns a dict mapping each question to an html table."""
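+    # Tables in the html file are keyed by their "data-question" attribute.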
+    if not os.path.exists(DF_FILE):
+        return None
+
+    expected_tables = {}
+    f = open(DF_FILE, encoding='utf-8')
+    soup = BeautifulSoup(f.read(), 'html.parser')
+    f.close()
+
+    tables = soup.find_all('table')
+    for table in tables:
+        expected_tables[table.get("data-question")] = table
+
+    return expected_tables
+
+def parse_df_html_table(table):
+    """parse_df_html_table(table) takes in a table as a html string and returns
+    a dict mapping each row and column index to the value at that position."""
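+    # Keys of the returned dict are (row label, column header) pairs, taken
+    # from the first column and the header row of the table respectively.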
+    rows = []
+    for tr in table.find_all('tr'):
+        rows.append([])
+        for cell in tr.find_all(['td', 'th']):
+            rows[-1].append(cell.get_text().strip("\n "))
+
+    cells = {}
+    for r in range(1, len(rows)):
+        for c in range(1, len(rows[0])):
+            rname = rows[r][0]
+            cname = rows[0][c]
+            cells[(rname,cname)] = rows[r][c]
+    return cells
+
+
+def get_expected_namedtuples():
+    """get_expected_namedtuples() defines the required namedtuple objects
+    globally. It also returns a tuple of the classes."""
+    expected_namedtuples = []
+    
+    return tuple(expected_namedtuples)
+
+_expected_namedtuples = get_expected_namedtuples()
+
+
+def compare_cell_html(expected, actual):
+    """compare_cell_html(expected, actual) is used to compare when the
+    expected answer is a DataFrame stored in the `expected_dfs` html file."""
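+    # Cells are compared as floats whenever both values can be parsed as
+    # numbers, and two NaN values are treated as equal.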
+    expected_cells = parse_df_html_table(expected)
+    try:
+        actual_cells = parse_df_html_table(BeautifulSoup(actual, 'html.parser').find('table'))
+    except Exception as e:
+        return "expected to find type DataFrame but found type %s instead" % type(actual).__name__
+
+    expected_cols = list(set(["column %s" % (loc[1]) for loc in expected_cells]))
+    actual_cols = list(set(["column %s" % (loc[1]) for loc in actual_cells]))
+    msg = list_compare_unordered(expected_cols, actual_cols, "DataFrame")
+    if msg != PASS:
+        return msg
+
+    expected_rows = list(set(["row index %s" % (loc[0]) for loc in expected_cells]))
+    actual_rows = list(set(["row index %s" % (loc[0]) for loc in actual_cells]))
+    msg = list_compare_unordered(expected_rows, actual_rows, "DataFrame")
+    if msg != PASS:
+        return msg
+
+    for location, expected_cell in expected_cells.items():
+        location_name = "column {} at index {}".format(location[1], location[0])
+        actual_cell = actual_cells.get(location, None)
+        if actual_cell == None:
+            return "in %s, expected to find %s" % (location_name, repr(expected_cell))
+        try:
+            actual_ans = float(actual_cell)
+            expected_ans = float(expected_cell)
+            if math.isnan(actual_ans) and math.isnan(expected_ans):
+                continue
+        except Exception as e:
+            actual_ans, expected_ans = actual_cell, expected_cell
+        msg = simple_compare(expected_ans, actual_ans)
+        if msg != PASS:
+            return "in %s, " % location_name + msg
+    return PASS
+
+
+def get_expected_plots():
+    """get_expected_plots() reads the json file with the expected plot data
+    and returns a dict mapping each question to a dictionary with the plot data."""
+    if not os.path.exists(PLOT_FILE):
+        return None
+
+    f = open(PLOT_FILE, encoding='utf-8')
+    expected_plots = json.load(f)
+    f.close()
+    return expected_plots
+
+
+def compare_file_json(expected, actual):
+    """compare_file_json(expected, actual) is used to compare when the
+    expected answer is a JSON file."""
+    msg = PASS
+    if not os.path.isfile(expected):
+        return "file %s not found; make sure it is downloaded and stored in the correct directory" % (expected)
+    elif not os.path.isfile(actual):
+        return "file %s not found; make sure that you have created the file with the correct name" % (actual)
+    try:
+        e = open(expected, encoding='utf-8')
+        expected_data = json.load(e)
+        e.close()
+    except json.JSONDecodeError:
+        return "file %s is broken and cannot be parsed; please delete and redownload the file correctly" % (expected)
+    try:
+        a = open(actual, encoding='utf-8')
+        actual_data = json.load(a)
+        a.close()
+    except json.JSONDecodeError:
+        return "file %s is broken and cannot be parsed" % (actual)
+    if type(expected_data) == list:
+        msg = list_compare_ordered(expected_data, actual_data, 'file ' + actual)
+    elif type(expected_data) == dict:
+        msg = dict_compare(expected_data, actual_data)
+    return msg
+
+
+_expected_json = get_expected_json()
+_special_json = get_special_json()
+_expected_plots = get_expected_plots()
+_expected_tables = get_expected_tables()
+_expected_format = get_expected_format()
+
+def check(qnum, actual):
+    """check(qnum, actual) is used to check if the answer in the notebook is
+    the correct answer, and provide useful feedback if the answer is incorrect."""
+    msg = PASS
+    error_msg = "<b style='color: red;'>ERROR:</b> "
+    q_format = _expected_format[qnum]
+
+    if q_format == TEXT_FORMAT_SPECIAL_ORDERED_LIST:
+        expected = _special_json[qnum]
+    elif q_format == PNG_FORMAT_SCATTER:
+        if _expected_plots == None:
+            msg = error_msg + "file %s not parsed; make sure it is downloaded and stored in the correct directory" % (PLOT_FILE)
+        else:
+            expected = _expected_plots[qnum]
+    elif q_format == HTML_FORMAT:
+        if _expected_tables == None:
+            msg = error_msg + "file %s not parsed; make sure it is downloaded and stored in the correct directory" % (DF_FILE)
+        else:
+            expected = _expected_tables[qnum]
+    else:
+        expected = _expected_json[qnum]
+
+    if SLASHES in q_format:
+        q_format = q_format.replace(SLASHES, "")
+        expected = clean_slashes(expected)
+        actual = clean_slashes(actual)
+
+    if msg != PASS:
+        print(msg)
+    else:
+        msg = compare(expected, actual, q_format)
+        if msg != PASS:
+            msg = error_msg + msg
+        print(msg)
+
+
+def check_file_size(path):
+    """check_file_size(path) throws an error if the file is too big to display
+    on Gradescope."""
+    size = os.path.getsize(path)
+    assert size < MAX_FILE_SIZE * 10**3, "Your file is too big to be displayed by Gradescope; please delete unnecessary output cells so your file size is < %s KB" % MAX_FILE_SIZE
+
+
+def reset_hidden_tests():
+    """reset_hidden_tests() resets all hidden tests on the Gradescope autograder where the hidden test file exists"""
+    if not os.path.exists(HIDDEN_FILE):
+        return
+    hidn.reset_hidden_tests()
+
+def rubric_check(rubric_point, ignore_past_errors=True):
+    """rubric_check(rubric_point) uses the hidden test file on the Gradescope autograder to grade the `rubric_point`"""
+    if not os.path.exists(HIDDEN_FILE):
+        print(PASS)
+        return
+    error_msg_1 = "ERROR: "
+    error_msg_2 = "TEST DETAILS: "
+    try:
+        msg = hidn.rubric_check(rubric_point, ignore_past_errors)
+    except:
+        msg = "hidden tests crashed before execution"
+    if msg != PASS:
+        hidn.make_deductions(rubric_point)
+        if msg == "public tests failed":
+            comment = "The public tests have failed, so you will not receive any points for this question."
+            comment += "\nPlease confirm that the public tests pass locally before submitting."
+        elif msg == "answer is hardcoded":
+            comment = "In the datasets for testing hardcoding, all numbers are replaced with random values."
+            comment += "\nIf the answer is the same as in the original dataset for all these datasets"
+            comment += "\ndespite this, that implies that the answer in the notebook is hardcoded."
+            comment += "\nYou will not receive any points for this question."
+        else:
+            comment = hidn.get_comment(rubric_point)
+        msg = error_msg_1 + msg
+        if comment != "":
+            msg = msg + "\n" + error_msg_2 + comment
+    print(msg)
+
+def get_summary():
+    """get_summary() returns the summary of the notebook using the hidden test file on the Gradescope autograder"""
+    if not os.path.exists(HIDDEN_FILE):
+        print("Total Score: %d/%d" % (TOTAL_SCORE, TOTAL_SCORE))
+        return
+    score = min(TOTAL_SCORE, hidn.get_score(TOTAL_SCORE))
+    display_msg = "Total Score: %d/%d" % (score, TOTAL_SCORE)
+    if score != TOTAL_SCORE:
+        display_msg += "\n" + hidn.get_deduction_string()
+    print(display_msg)
+
+def get_score_digit(digit):
+    """get_score_digit(digit) returns the `digit` of the score using the hidden test file on the Gradescope autograder"""
+    if not os.path.exists(HIDDEN_FILE):
+        score = TOTAL_SCORE
+    else:
+        score = hidn.get_score(TOTAL_SCORE)
+    digits = bin(score)[2:]
+    digits = "0"*(7 - len(digits)) + digits
+    return int(digits[6 - digit])
diff --git a/p5/rubric.md b/p5/rubric.md
new file mode 100644
index 0000000000000000000000000000000000000000..3f07999b3c9a02337c80cbf3e18bae988134b168
--- /dev/null
+++ b/p5/rubric.md
@@ -0,0 +1,139 @@
+# Project 5 (P5) grading rubric
+
+## Code reviews
+
+- The Gradescope autograder will make deductions based on the rubric provided below.
+- To ensure that you don't lose any points, you must review the rubric and make sure that you have followed the instructions provided in the project correctly.
+
+## Rubric
+
+### General guidelines:
+
+- Did not save the notebook file prior to running the cell containing "export". We cannot see your output if you do not save before generating the zip file. This deduction will become stricter for future projects. (-3)
+- Functions are defined more than once. (-3)
+- Import statements are not all placed at the top of the notebook. (-1)
+- Used concepts or modules not covered in class yet. (-5)
+- Hardcoded answers. (all points allotted for that question)
+
+### Question specific guidelines:
+
+- q1 (2)
+	- required function is not used (-1)
+
+- q2 (3)
+	- required function is not used (-1)
+
+- q3 (4)
+	- index of the last hurricane is hardcoded (-3)
+	- required function is not used (-1)
+
+- q4 (4)
+  - incorrect logic is used to answer (-2)
+  - number of hurricanes in the dataset is hardcoded (-1)
+
+- q5 (4)
+	- incorrect logic is used to answer (-2)
+	- number of hurricanes in the dataset is hardcoded (-1)
+
+- `format_damage` (4)
+  - function output is incorrect when the damage has suffix `K` (-1)
+	- function output is incorrect when the damage has suffix `M` (-1)
+	- function output is incorrect when the damage has suffix `B` (-1)
+	- function output is incorrect when the damage has no suffix (-1)
+
+- q6 (4)
+	- did not exit loop and instead iterated further after finding the hurricane (-2)
+	- number of hurricanes in the dataset is hardcoded (-1)
+	- `format_damage` function is not used to convert the damages into an integer (-1)
+
+- q7 (4)
+	- incorrect logic is used to answer (-2)
+	- number of hurricanes in the dataset is hardcoded (-1)
+	- `format_damage` function is not used to convert the damages into an integer (-1)
+
+- q8 (4)
+	- incorrect logic is used to answer (-2)
+	- number of hurricanes in the dataset is hardcoded (-1)
+	- `format_damage` function is not used to convert the damages into an integer (-1)
+
+- q9 (4)
+	- incorrect logic is used to answer (-2)
+	- tie breaking is not implemented correctly (-1)
+	- number of hurricanes in the dataset is hardcoded (-1)
+
+- q10 (4)
+	- incorrect logic is used to answer (-2)
+	- tie breaking is not implemented correctly (-1)
+	- number of hurricanes in the dataset is hardcoded (-1)
+
+- `get_year` (2)
+	- function logic is incorrect (-2)
+
+- `get_month` (2)
+	- function logic is incorrect (-2)
+
+- `get_day` (2)
+	- function logic is incorrect (-2)
+
+- q11 (5)
+  - variable to store the index or name of the earliest hurricane is not initialized as `None` (-1)
+	- `get_year` function is not used to determine the year of formation (-1)
+	- used indices of the hurricanes to determine the earliest hurricane (-1)
+	- hurricanes with damages <= 1B are not ignored (-1)
+	- number of hurricanes in the dataset is hardcoded (-1)
+
+- q12 (5)
+	- variable to store the index or name of the most recent hurricane is not initialized as `None` (-1)
+	- `get_year` function is not used to determine the year of formation (-1)
+	- used indices of the hurricanes to determine the most recent hurricane (-1)
+	- hurricanes with damages <= 100B are not ignored (-1)
+	- number of hurricanes in the dataset is hardcoded (-1)
+
+- `deadliest_in_range` (4)
+  - variable to store the index of the deadliest hurricane is not initialized as `None` (-1)
+	- function does not consider all hurricanes active between `year1` and `year2` (-1)
+	- number of hurricanes in the dataset is hardcoded (-1)
+	- function logic is incorrect (-1)
+
+- q13 (4)
+  - functions `deadliest_in_range` and `format_damage` are not used to answer (-2)
+	- incorrect logic is used to answer (-2)
+
+- q14 (4)
+	- function `deadliest_in_range` is not used to answer (-2)
+	- incorrect logic is used to answer (-2)
+
+- q15 (4)  
+	- functions `get_year` and `get_month` are not used to answer (-1)
+	- incorrect logic is used to answer (-1)
+	- number of hurricanes in the dataset is hardcoded (-1)
+
+- `get_year_total` (4)
+	- function logic is incorrect (-2)
+	- function `get_year` is not used to answer (-1)
+	- number of hurricanes in the dataset is hardcoded (-1)
+
+- q16 (4)
+	- function `get_year_total` is not used to answer (-3)
+
+- q17 (5)
+  - function `get_year_total` is not used to answer (-2)
+	- did not loop through the years in the last decade and hardcoded all ten years (-2)
+	- incorrect logic is used to answer (-1)
+
+- q18 (5)
+  - `year_with_most_hurricanes` is not initialized as some year in the twentieth century, or as `None` (-2)
+	- function `get_year_total` is not used to determine the year with the most hurricanes (-1)
+	- tie breaking is not implemented correctly (-1)
+	- incorrect logic is used to answer (-1)
+
+- q19 (4)
+  - hurricanes that formed at the end of one year and dissipated at the end of the next are not considered (-1)
+	- incorrect logic is used to answer (-1)
+	- function `get_month` is not used to answer (-1)
+
+- q20 (5)
+  - years with no deadliest hurricane are not ignored (-1)
+	- all hurricanes formed between 2001 and 2023 are not considered (-1)
+	- incorrect logic is used to answer (-1)
+	- functions `deadliest_in_range` and `format_damage` are not used to answer (-1)