{"id":2158,"date":"2019-02-12T05:37:41","date_gmt":"2019-02-12T05:37:41","guid":{"rendered":"http:\/\/sadievrenseker.com\/wp\/?p=2158"},"modified":"2019-05-08T06:53:43","modified_gmt":"2019-05-08T06:53:43","slug":"cs447-introduction-to-data-science","status":"publish","type":"post","link":"https:\/\/sadievrenseker.com\/?p=2158","title":{"rendered":"CS447 Introduction to Data Science"},"content":{"rendered":"\r\n\r\n\r\n<p><strong>Antalya Science University<\/strong><\/p>\r\n\r\n\r\n\r\n<p><strong>Course Name: Introduction\u00a0to\u00a0Data Science<\/strong><\/p>\r\n\r\n\r\n\r\n<p><strong>Course Code:<\/strong>\u00a0CS 447<\/p>\r\n\r\n\r\n\r\n<p><strong>Language of Course:<\/strong>\u00a0English<\/p>\r\n\r\n\r\n\r\n<p><strong>Credit:<\/strong>\u00a03<\/p>\r\n\r\n\r\n\r\n<p><strong>Course Coordinator \/ Instructor:<\/strong>\u00a0\u015eadi Evren \u015eEKER<\/p>\r\n\r\n\r\n\r\n<p><strong>Contact:<\/strong>\u00a0intrds@sadievrenseker.com<\/p>\r\n\r\n\r\n\r\n<p><strong>Schedule:<\/strong>\u00a0Tuesday 15.00 &#8211; 18.00<\/p>\r\n\r\n\r\n\r\n<p><strong>Course Description: \u00a0<\/strong>This course is an introduction level course to data science, specialized on machine learning, artificial intelligence and big data.<\/p>\r\n\r\n\r\n\r\n<ul>\r\n<li>The course starts with a top down approach to data science projects. The first step is covering data science project management techniques and we follow\u00a0<strong>CRISP-DM<\/strong>\u00a0methodology with 6 steps below:<\/li>\r\n<\/ul>\r\n\r\n\r\n\r\n<ul>\r\n<li><strong>Business Understanding :<\/strong>\u00a0We cover the types of problems and business processes in real life<\/li>\r\n<\/ul>\r\n\r\n\r\n\r\n<ul>\r\n<li><strong>Data Understanding:<\/strong>\u00a0We cover the data types and data problems. 
We also visualize the data to explore it.\u00a0<\/li>\r\n<\/ul>\r\n\r\n\r\n\r\n<ul>\r\n<li><strong>Data Preprocessing:\u00a0<\/strong>We cover the classical problems on data and how to handle issues such as\u00a0<strong>noisy or dirty data and missing values<\/strong>, along with row and column\u00a0<strong>filtering and data integration with concatenation and joins<\/strong>. We also cover data transformations such as\u00a0<strong>discretization, normalization, and pivoting<\/strong>.\u00a0<\/li>\r\n<\/ul>\r\n\r\n\r\n\r\n<ul>\r\n<li><strong>Machine Learning:<\/strong>\u00a0We cover classification algorithms such as\u00a0<em>Naive Bayes, Decision Trees, Logistic Regression, and K-NN.<\/em>\u00a0We also cover prediction \/\u00a0regression algorithms such as linear regression, polynomial regression, and decision tree regression, as well as unsupervised learning problems such as clustering and association rule learning, with k-means, hierarchical clustering, and the Apriori algorithm. Finally, we cover\u00a0<strong>ensemble techniques<\/strong>\u00a0in KNIME and Python on big data platforms.<\/li>\r\n<\/ul>\r\n\r\n\r\n\r\n<ul>\r\n<li><strong>Evaluation:\u00a0<\/strong>In the final step of data science, we study the metrics of success: the confusion matrix, precision, recall, sensitivity, and specificity for classification; purity and the Rand index for clustering; and RMSE, RMAE, MSE, and MAE for regression \/\u00a0prediction problems, with KNIME and Python on big data platforms.<\/li>\r\n<\/ul>\r\n\r\n\r\n\r\n\r\n\r\n<p><strong>Course Objective and Learning Outcomes:\u00a0<\/strong><\/p>\r\n\r\n\r\n\r\n<p>1.\u00a0\u00a0\u00a0\u00a0 Understanding of real-life cases about data<\/p>\r\n\r\n\r\n\r\n<p>2.\u00a0\u00a0\u00a0\u00a0 Understanding of real-life data-related problems<\/p>\r\n\r\n\r\n\r\n<p>3.\u00a0\u00a0\u00a0\u00a0 Understanding of data analysis methodologies<\/p>\r\n\r\n\r\n\r\n<p>4.\u00a0\u00a0\u00a0\u00a0 Understanding of basic data operations such as preprocessing, transformation 
or manipulation<\/p>\r\n\r\n\r\n\r\n<p>5.\u00a0\u00a0\u00a0\u00a0 Understanding of new technologies such as big data, NoSQL, and cloud computing<\/p>\r\n\r\n\r\n\r\n<p>6.\u00a0\u00a0\u00a0\u00a0 Ability to use trending software in the industry<\/p>\r\n\r\n\r\n\r\n<p>7.\u00a0\u00a0\u00a0\u00a0 Introduction to data-related problems and their applications<\/p>\r\n\r\n\r\n\r\n\r\n\r\n<p><strong>Tools:<\/strong><\/p>\r\n\r\n\r\n\r\n<p>List of course software:<\/p>\r\n\r\n\r\n\r\n<p>\u00b7\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 Excel,<\/p>\r\n\r\n\r\n\r\n<p>\u00b7\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 KNIME,<\/p>\r\n\r\n\r\n\r\n<p>\u00b7\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 Python programming with NumPy, Pandas, scikit-learn, statsmodels, or Dask<\/p>\r\n\r\n\r\n\r\n\r\n\r\n<p>This course follows a hands-on approach in all steps, so attendance with a laptop computer is necessary. The software listed above will be provided during the course, and the list is subject to updates.<\/p>\r\n\r\n\r\n\r\n<p><strong>Grading<\/strong><\/p>\r\n\r\n\r\n\r\n<p>Reading, Attendance and Discussions:\u00a030%<\/p>\r\n\r\n\r\n\r\n<p>Homework:\u00a030%<\/p>\r\n\r\n\r\n\r\n<p>Project:\u00a040%<\/p>\r\n\r\n\r\n\r\n<p><strong>Course Content:<\/strong><\/p>\r\n\r\n\r\n\r\n<table class=\"wp-block-table\">\r\n<tbody>\r\n<tr>\r\n<td><strong>Week 1 (Feb 19): Introduction to Data, Problems and Real World Examples<\/strong><br \/>Some useful information:<br \/>DIKW Pyramid:\u00a0<a href=\"https:\/\/www.google.com.tr\/url?sa=t&amp;rct=j&amp;q=&amp;esrc=s&amp;source=web&amp;cd=1&amp;cad=rja&amp;uact=8&amp;ved=0ahUKEwiIs6utyKDXAhXqIJoKHRIdDNAQFggnMAA&amp;url=https%3A%2F%2Fen.wikipedia.org%2Fwiki%2FDIKW_pyramid&amp;usg=AOvVaw1ddCSlI29On5ZqRhf1vREE\">DIKW pyramid \u2013 Wikipedia<\/a><br \/>CRISP-DM:\u00a0<a 
href=\"https:\/\/www.google.com.tr\/url?sa=t&amp;rct=j&amp;q=&amp;esrc=s&amp;source=web&amp;cd=4&amp;cad=rja&amp;uact=8&amp;ved=0ahUKEwjT37ecyKDXAhUoYpoKHVVtAlsQFgg3MAM&amp;url=https%3A%2F%2Fen.wikipedia.org%2Fwiki%2FCross-industry_standard_process_for_data_mining&amp;usg=AOvVaw0_SytZPUTYDZLBCbanvkr0\">Cross-industry standard process for data mining \u2013 Wikipedia<\/a>Slides from first week:<a href=\"http:\/\/sadievrenseker.com\/wp-content\/uploads\/2017\/11\/week1-1.pdf\">week1<\/a><\/td>\r\n<\/tr>\r\n<tr>\r\n<td><strong>Week 2 (Feb 26): Introduction to Descriptive Analytics<\/strong><br \/>Repeating the first week for majority of the class and starting the concept of end to end data science projects. Weight and Heigh Sample project and Data Set for Knime work flow. Brief introduction to algorithms: K-NN, Naive Bayes, Decision Trees, Linear Regression<\/td>\r\n<\/tr>\r\n<tr>\r\n<td>\r\n<p><strong>Week 3 (Mar 5): Introduction to Data Manipulation<\/strong><br \/>Concept of Data and types of data : Categorical (Nominal, Ordinal) and Numerical (Interval, Ratio). <br \/>Basic Data Manipulation techniques with Knime: <br \/>1.Row Filter and Concept of Missing Values<br \/>2.Column Filter<br \/>3.Advanced Filters<br \/>4.Concatenate<br \/>5.Join<br \/>6. Group by , Aggregation<br \/>7. Formulas, String Replace<br \/>8. String Manipulation<br \/>9. Discrete, Quantized Data, Binning<br \/>10. 
Normalization<br \/>11. Splitting and Merging<br \/>12. Type Conversion (Numeric, String)<\/p>\r\n<p><a href=\"http:\/\/sadievrenseker.com\/wp-content\/uploads\/2019\/03\/abu_preprocessing.knwf_.zip\"><img loading=\"lazy\" decoding=\"async\" class=\"wp-image-2169\" src=\"http:\/\/sadievrenseker.com\/wp-content\/uploads\/2019\/03\/abu_knime_preprocessing-1024x693.png\" alt=\"\" width=\"555\" height=\"375\" srcset=\"https:\/\/sadievrenseker.com\/wp-content\/uploads\/2019\/03\/abu_knime_preprocessing-1024x693.png 1024w, https:\/\/sadievrenseker.com\/wp-content\/uploads\/2019\/03\/abu_knime_preprocessing-300x203.png 300w, https:\/\/sadievrenseker.com\/wp-content\/uploads\/2019\/03\/abu_knime_preprocessing-768x520.png 768w, https:\/\/sadievrenseker.com\/wp-content\/uploads\/2019\/03\/abu_knime_preprocessing.png 1706w\" sizes=\"auto, (max-width: 555px) 100vw, 555px\" \/><\/a><\/p>\r\n<\/td>\r\n<\/tr>\r\n<tr>\r\n<td>\r\n<p><strong>Week 4 (Mar. 12): Introduction to Python Programming for Data Science and an end-to-end Python application for data science<\/strong><br \/>Brief review of Python programming<br \/>Introduction to data manipulation libraries: NumPy and Pandas<br \/>Introduction to the scikit-learn library and a sample classification<\/p>\r\n<p>You can install Anaconda and Spyder from the link below:<\/p>\r\n<p><a href=\"http:\/\/www.anaconda.org\"><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-medium wp-image-2176\" src=\"http:\/\/sadievrenseker.com\/wp-content\/uploads\/2019\/03\/maxresdefault-3-300x169.jpg\" alt=\"\" width=\"300\" height=\"169\" srcset=\"https:\/\/sadievrenseker.com\/wp-content\/uploads\/2019\/03\/maxresdefault-3-300x169.jpg 300w, https:\/\/sadievrenseker.com\/wp-content\/uploads\/2019\/03\/maxresdefault-3-768x432.jpg 768w, https:\/\/sadievrenseker.com\/wp-content\/uploads\/2019\/03\/maxresdefault-3-1024x576.jpg 1024w, https:\/\/sadievrenseker.com\/wp-content\/uploads\/2019\/03\/maxresdefault-3.jpg 1280w\" sizes=\"auto, 
(max-width: 300px) 100vw, 300px\" \/><\/a><\/p>\r\n<p>We also covered the topics below during the class:<\/p>\r\n<ul>\r\n<li>Data loading from an external source using the Pandas library (with the read_excel or read_csv methods)<\/li>\r\n<li>DataFrame slicing and dicing (using the iloc property and the lists provided to the iloc method)<\/li>\r\n<li>Column filtering (with copying into a new data frame)<\/li>\r\n<li>Row filtering (with copying into a new data frame)<\/li>\r\n<li>Advanced row filtering (such as filtering the people with an even height value)<\/li>\r\n<li>Column- or row-wise formulas (we calculated the BMI for everybody)<\/li>\r\n<li>Quantization (discretization or binning), where we applied condition-based binning\u00a0<\/li>\r\n<li>Min&#8211;Max normalization (we used MinMaxScaler from the scikit-learn library)<\/li>\r\n<li>Group By operation (we used the groupby method from the Pandas library)<\/li>\r\n<\/ul>\r\n<p><a href=\"http:\/\/sadievrenseker.com\/wp-content\/uploads\/2019\/02\/anaconda_preprocessing.png\"><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-medium wp-image-2177\" src=\"http:\/\/sadievrenseker.com\/wp-content\/uploads\/2019\/02\/anaconda_preprocessing-300x198.png\" alt=\"\" width=\"300\" height=\"198\" srcset=\"https:\/\/sadievrenseker.com\/wp-content\/uploads\/2019\/02\/anaconda_preprocessing-300x198.png 300w, https:\/\/sadievrenseker.com\/wp-content\/uploads\/2019\/02\/anaconda_preprocessing-768x507.png 768w, https:\/\/sadievrenseker.com\/wp-content\/uploads\/2019\/02\/anaconda_preprocessing-1024x676.png 1024w, https:\/\/sadievrenseker.com\/wp-content\/uploads\/2019\/02\/anaconda_preprocessing.png 1770w\" sizes=\"auto, (max-width: 300px) 100vw, 300px\" \/><\/a><br \/><a href=\"http:\/\/sadievrenseker.com\/wp-content\/uploads\/2019\/03\/Archive-6.zip\">Click here to download the codes from the class<\/a><\/p>\r\n<p>For further information, I strongly suggest reading the following 
documentation:<\/p>\r\n<ul>\r\n<li>Pandas Library:\u00a0<a href=\"https:\/\/pandas.pydata.org\/pandas-docs\/stable\/\">https:\/\/pandas.pydata.org\/pandas-docs\/stable\/<\/a><\/li>\r\n<li>NumPy Library:\u00a0<a href=\"http:\/\/www.numpy.org\">http:\/\/www.numpy.org<\/a><\/li>\r\n<li>scikit-learn Library:\u00a0<a href=\"https:\/\/scikit-learn.org\/stable\/\">https:\/\/scikit-learn.org\/stable\/<\/a><\/li>\r\n<li>Pandas DataFrame (this is the main topic we covered this week):\u00a0<a href=\"https:\/\/pandas.pydata.org\/pandas-docs\/stable\/reference\/api\/pandas.DataFrame.html\">https:\/\/pandas.pydata.org\/pandas-docs\/stable\/reference\/api\/pandas.DataFrame.html<\/a><\/li>\r\n<\/ul>\r\n<\/td>\r\n<\/tr>\r\n<tr>\r\n<td>\r\n<p><strong>Week 5 (Mar 19): Classification Algorithms<\/strong><br \/>Concepts of classification algorithms, implementing the algorithms in KNIME and coding them in Python. The algorithms covered are:<br \/>K-NN<br \/>Naive Bayes<br \/>Decision Tree<br \/>Logistic Regression<br \/>Support Vector Machines<\/p>\r\n<p><a href=\"http:\/\/sadievrenseker.com\/wp-content\/uploads\/2019\/02\/abu2ndweek.py_.zip\">Second Python code of the course, for the classification algorithms<\/a><\/p>\r\n<p><a href=\"http:\/\/sadievrenseker.com\/wp-content\/uploads\/2019\/02\/abu_classification.knwf_.zip\">KNIME workflow for the classification algorithms<\/a><\/p>\r\n<p><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-medium wp-image-2182\" src=\"http:\/\/sadievrenseker.com\/wp-content\/uploads\/2019\/02\/Screenshot-at-Mar-19-21-54-02-300x196.png\" alt=\"\" width=\"300\" height=\"196\" srcset=\"https:\/\/sadievrenseker.com\/wp-content\/uploads\/2019\/02\/Screenshot-at-Mar-19-21-54-02-300x196.png 300w, https:\/\/sadievrenseker.com\/wp-content\/uploads\/2019\/02\/Screenshot-at-Mar-19-21-54-02-768x501.png 768w, https:\/\/sadievrenseker.com\/wp-content\/uploads\/2019\/02\/Screenshot-at-Mar-19-21-54-02-1024x667.png 1024w, 
https:\/\/sadievrenseker.com\/wp-content\/uploads\/2019\/02\/Screenshot-at-Mar-19-21-54-02.png 1562w\" sizes=\"auto, (max-width: 300px) 100vw, 300px\" \/><\/p>\r\n<\/td>\r\n<\/tr>\r\n<tr>\r\n<td>\r\n<p><strong>Week 6 (Mar 26): Regression Algorithms<\/strong><br \/>Concepts of prediction algorithms, implementing the algorithms in KNIME and coding them in Python. The algorithms covered are:<br \/>Linear Regression<br \/>Polynomial Regression<br \/>Support Vector Regressor<br \/>Regression Trees and Decision Tree Regressor<\/p>\r\n<p><a href=\"http:\/\/sadievrenseker.com\/wp-content\/uploads\/2019\/02\/Archive-3.zip\">Python code for the regression algorithms<\/a><\/p>\r\n<p><a href=\"http:\/\/sadievrenseker.com\/wp-content\/uploads\/2019\/02\/abu_regression.png\"><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-medium wp-image-2196\" src=\"http:\/\/sadievrenseker.com\/wp-content\/uploads\/2019\/02\/abu_regression-300x196.png\" alt=\"\" width=\"300\" height=\"196\" srcset=\"https:\/\/sadievrenseker.com\/wp-content\/uploads\/2019\/02\/abu_regression-300x196.png 300w, https:\/\/sadievrenseker.com\/wp-content\/uploads\/2019\/02\/abu_regression-768x501.png 768w, https:\/\/sadievrenseker.com\/wp-content\/uploads\/2019\/02\/abu_regression-1024x668.png 1024w, https:\/\/sadievrenseker.com\/wp-content\/uploads\/2019\/02\/abu_regression.png 1690w\" sizes=\"auto, (max-width: 300px) 100vw, 300px\" \/><\/a><\/p>\r\n<p><a href=\"http:\/\/sadievrenseker.com\/wp-content\/uploads\/2019\/02\/abu_stockexchange.knwf_.zip\">KNIME workflow and the BIST 100 data set for the regression algorithms\u00a0<\/a><\/p>\r\n<p><a href=\"http:\/\/sadievrenseker.com\/wp-content\/uploads\/2019\/02\/stockexchange_bist_prediction.png\"><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-medium wp-image-2188\" src=\"http:\/\/sadievrenseker.com\/wp-content\/uploads\/2019\/02\/stockexchange_bist_prediction-300x213.png\" alt=\"\" width=\"300\" height=\"213\" 
srcset=\"https:\/\/sadievrenseker.com\/wp-content\/uploads\/2019\/02\/stockexchange_bist_prediction-300x213.png 300w, https:\/\/sadievrenseker.com\/wp-content\/uploads\/2019\/02\/stockexchange_bist_prediction-768x546.png 768w, https:\/\/sadievrenseker.com\/wp-content\/uploads\/2019\/02\/stockexchange_bist_prediction-1024x728.png 1024w, https:\/\/sadievrenseker.com\/wp-content\/uploads\/2019\/02\/stockexchange_bist_prediction.png 1806w\" sizes=\"auto, (max-width: 300px) 100vw, 300px\" \/><\/a><\/p>\r\n<p>The Data Set obtained from : <a href=\"http:\/\/finance.yahoo.com\">finance.yahoo.com<\/a><\/p>\r\n<\/td>\r\n<\/tr>\r\n<tr>\r\n<td>\r\n<p><strong>Week 7 (Apr 2): Clustering Algorithms<\/strong><br \/>concepts of clustering algorithms, implementing the algorithms in Knime and coding in python. Algorithms covered are:<br \/>K-Means<br \/>DBScan<br \/>Hierarchical Clustering<\/p>\r\n<p><a href=\"http:\/\/sadievrenseker.com\/wp-content\/uploads\/2019\/02\/abu_clustering.knwf_.zip\">Knime Workflow<\/a><\/p>\r\n<p><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-medium wp-image-2192\" src=\"http:\/\/sadievrenseker.com\/wp-content\/uploads\/2019\/02\/clustering-300x197.png\" alt=\"\" width=\"300\" height=\"197\" srcset=\"https:\/\/sadievrenseker.com\/wp-content\/uploads\/2019\/02\/clustering-300x197.png 300w, https:\/\/sadievrenseker.com\/wp-content\/uploads\/2019\/02\/clustering-768x504.png 768w, https:\/\/sadievrenseker.com\/wp-content\/uploads\/2019\/02\/clustering-1024x672.png 1024w, https:\/\/sadievrenseker.com\/wp-content\/uploads\/2019\/02\/clustering.png 1310w\" sizes=\"auto, (max-width: 300px) 100vw, 300px\" \/><\/p>\r\n<p><a href=\"http:\/\/sadievrenseker.com\/wp-content\/uploads\/2019\/02\/abu_clustering.py_.zip\">Python Code<\/a><\/p>\r\n<p><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-medium wp-image-2194\" src=\"http:\/\/sadievrenseker.com\/wp-content\/uploads\/2019\/02\/abu_clustering_python-300x291.png\" alt=\"\" 
width=\"300\" height=\"291\" srcset=\"https:\/\/sadievrenseker.com\/wp-content\/uploads\/2019\/02\/abu_clustering_python-300x291.png 300w, https:\/\/sadievrenseker.com\/wp-content\/uploads\/2019\/02\/abu_clustering_python-768x746.png 768w, https:\/\/sadievrenseker.com\/wp-content\/uploads\/2019\/02\/abu_clustering_python-1024x994.png 1024w, https:\/\/sadievrenseker.com\/wp-content\/uploads\/2019\/02\/abu_clustering_python.png 1318w\" sizes=\"auto, (max-width: 300px) 100vw, 300px\" \/><\/p>\r\n<\/td>\r\n<\/tr>\r\n<tr>\r\n<td>\r\n<p><strong>Week 8 (Apr 9): Association Rule Mining<\/strong><br \/>concepts of association rule mining (ARM) and association rule learning (ARL) algorithms, implementing the algorithms in Knime and coding in python. Algorithms covered are:<br \/>A-Priori Algorithm<\/p>\r\n<p><a href=\"http:\/\/sadievrenseker.com\/wp-content\/uploads\/2019\/02\/apyori.py_.zip\">Click Here To Download Apyroiri Library for the Python Codes<\/a><\/p>\r\n<p><a href=\"http:\/\/sadievrenseker.com\/wp-content\/uploads\/2019\/02\/Archive-2.zip\"><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-medium wp-image-2201\" src=\"http:\/\/sadievrenseker.com\/wp-content\/uploads\/2019\/02\/Screenshot-at-Apr-10-05-28-46-1-300x173.png\" alt=\"\" width=\"300\" height=\"173\" srcset=\"https:\/\/sadievrenseker.com\/wp-content\/uploads\/2019\/02\/Screenshot-at-Apr-10-05-28-46-1-300x173.png 300w, https:\/\/sadievrenseker.com\/wp-content\/uploads\/2019\/02\/Screenshot-at-Apr-10-05-28-46-1-768x442.png 768w, https:\/\/sadievrenseker.com\/wp-content\/uploads\/2019\/02\/Screenshot-at-Apr-10-05-28-46-1-1024x589.png 1024w, https:\/\/sadievrenseker.com\/wp-content\/uploads\/2019\/02\/Screenshot-at-Apr-10-05-28-46-1.png 1394w\" sizes=\"auto, (max-width: 300px) 100vw, 300px\" \/><\/a><\/p>\r\n<p><a href=\"http:\/\/sadievrenseker.com\/wp-content\/uploads\/2019\/02\/Archive-2.zip\">click for python code\u00a0<\/a><\/p>\r\n<p><a 
href=\"http:\/\/sadievrenseker.com\/wp-content\/uploads\/2019\/02\/KNIME_project17.knwf_.zip\"><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-medium wp-image-2202\" src=\"http:\/\/sadievrenseker.com\/wp-content\/uploads\/2019\/02\/abu_association-300x172.png\" alt=\"\" width=\"300\" height=\"172\" srcset=\"https:\/\/sadievrenseker.com\/wp-content\/uploads\/2019\/02\/abu_association-300x172.png 300w, https:\/\/sadievrenseker.com\/wp-content\/uploads\/2019\/02\/abu_association-768x441.png 768w, https:\/\/sadievrenseker.com\/wp-content\/uploads\/2019\/02\/abu_association-1024x588.png 1024w, https:\/\/sadievrenseker.com\/wp-content\/uploads\/2019\/02\/abu_association.png 1236w\" sizes=\"auto, (max-width: 300px) 100vw, 300px\" \/><\/a><\/p>\r\n<p><a href=\"http:\/\/sadievrenseker.com\/wp-content\/uploads\/2019\/02\/KNIME_project17.knwf_.zip\">click for knime workflow<\/a><\/p>\r\n<p><a href=\"https:\/\/www.kaggle.com\/c\/instacart-market-basket-analysis\">Homework : Link for Kaggle, instacart<\/a><\/p>\r\n<\/td>\r\n<\/tr>\r\n<tr>\r\n<td>\r\n<p><strong>Week 9 (Apr 16): Concept of Error and Evaluation Techniques<\/strong><br \/>n-Fold Cross Validation , LOO, Split Validation<br \/>RMSE, MAE, R2 values for regression<br \/>RandIndex, Silhouet, WCSS for clustering algorithms<br \/>Accuracy, Recall, Precision, F-Score, F1-Score etc. 
for classification algorithms<\/p>\r\n<p>We also had an introduction to dimensionality reduction with PCA (principal component analysis) and neural networks with the MLP (multi-layer perceptron).<\/p>\r\n<p>Please don&#8217;t forget to install Keras for next week.<\/p>\r\n<p><a href=\"http:\/\/sadievrenseker.com\/wp-content\/uploads\/2019\/02\/abu_rf.knwf_.zip\"><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-medium wp-image-2208\" src=\"http:\/\/sadievrenseker.com\/wp-content\/uploads\/2019\/02\/Screenshot-at-Apr-16-19-20-46-300x204.png\" alt=\"\" width=\"300\" height=\"204\" srcset=\"https:\/\/sadievrenseker.com\/wp-content\/uploads\/2019\/02\/Screenshot-at-Apr-16-19-20-46-300x204.png 300w, https:\/\/sadievrenseker.com\/wp-content\/uploads\/2019\/02\/Screenshot-at-Apr-16-19-20-46-768x521.png 768w, https:\/\/sadievrenseker.com\/wp-content\/uploads\/2019\/02\/Screenshot-at-Apr-16-19-20-46-1024x695.png 1024w, https:\/\/sadievrenseker.com\/wp-content\/uploads\/2019\/02\/Screenshot-at-Apr-16-19-20-46.png 1580w\" sizes=\"auto, (max-width: 300px) 100vw, 300px\" \/><\/a><\/p>\r\n<p><a href=\"http:\/\/sadievrenseker.com\/wp-content\/uploads\/2019\/02\/abu_randomforest.py_.zip\"><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-medium wp-image-2209\" src=\"http:\/\/sadievrenseker.com\/wp-content\/uploads\/2019\/02\/Screenshot-at-Apr-16-19-20-03-300x159.png\" alt=\"\" width=\"300\" height=\"159\" srcset=\"https:\/\/sadievrenseker.com\/wp-content\/uploads\/2019\/02\/Screenshot-at-Apr-16-19-20-03-300x159.png 300w, https:\/\/sadievrenseker.com\/wp-content\/uploads\/2019\/02\/Screenshot-at-Apr-16-19-20-03-768x407.png 768w, https:\/\/sadievrenseker.com\/wp-content\/uploads\/2019\/02\/Screenshot-at-Apr-16-19-20-03-1024x542.png 1024w\" sizes=\"auto, (max-width: 300px) 100vw, 300px\" \/><\/a><\/p>\r\n<\/td>\r\n<\/tr>\r\n<tr>\r\n<td>\r\n<p><strong>Week 10 (Apr 23): Collective Learning<\/strong><\/p>\r\n<p>This content was moved to the previous week because of 
the holiday.<\/p>\r\n<\/td>\r\n<\/tr>\r\n<tr>\r\n<td>\r\n<p><strong>Week 11 (Apr 30): Collective Learning and Consensus Learning and Clustering Algorithms:\u00a0<\/strong>Ensemble Learning, Bagging, Boosting Techniques, Random Forest, GBM, XGBoost, LightGBM<\/p>\r\n<p>Some useful links for the class:<\/p>\r\n<ul>\r\n<li>Understanding boosting with a simple decision tree:\u00a0<a href=\"https:\/\/towardsdatascience.com\/boosting-algorithm-gbm-97737c63daa3\">https:\/\/towardsdatascience.com\/boosting-algorithm-gbm-97737c63daa3<\/a><\/li>\r\n<li>A simplified version of GBM coding and visualization:\u00a0<a href=\"https:\/\/medium.com\/mlreview\/gradient-boosting-from-scratch-1e317ae4587d\">https:\/\/medium.com\/mlreview\/gradient-boosting-from-scratch-1e317ae4587d<\/a><\/li>\r\n<li>Kaggle entry for the same GBM story (also holds the from-scratch code of the DecisionTree class):\u00a0<a href=\"https:\/\/www.kaggle.com\/grroverpr\/gradient-boosting-simplified\/\">https:\/\/www.kaggle.com\/grroverpr\/gradient-boosting-simplified\/<\/a><\/li>\r\n<li>If you are curious about the splitting point and the std_agg or var_split functions:\u00a0<a href=\"https:\/\/towardsdatascience.com\/random-forests-and-decision-trees-from-scratch-in-python-3e4fa5ae4249\">https:\/\/towardsdatascience.com\/random-forests-and-decision-trees-from-scratch-in-python-3e4fa5ae4249<\/a><\/li>\r\n<\/ul>\r\n<p>Readings and resources:\u00a0<\/p>\r\n<ul>\r\n<li>XGBoost Algorithm:<a href=\"https:\/\/xgboost.ai\">\u00a0https:\/\/xgboost.ai<\/a><\/li>\r\n<li>An early resource for XGBoost:\u00a0<a href=\"http:\/\/xgboost.readthedocs.io\">xgboost.readthedocs.io<\/a><\/li>\r\n<\/ul>\r\n<p>Python code from the class:<\/p>\r\n<p>Gradient Boosting:<\/p>\r\n<p><a href=\"http:\/\/sadievrenseker.com\/wp-content\/uploads\/2019\/02\/abu_boosting.py_.zip\"><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-medium wp-image-2215\" 
src=\"http:\/\/sadievrenseker.com\/wp-content\/uploads\/2019\/02\/gbm_abu-300x161.png\" alt=\"\" width=\"300\" height=\"161\" srcset=\"https:\/\/sadievrenseker.com\/wp-content\/uploads\/2019\/02\/gbm_abu-300x161.png 300w, https:\/\/sadievrenseker.com\/wp-content\/uploads\/2019\/02\/gbm_abu.png 640w\" sizes=\"auto, (max-width: 300px) 100vw, 300px\" \/><\/a><\/p>\r\n<p>XGBoost (for running the code install XGBoost by the command prompt:\u00a0<\/p>\r\n<p>conda install -c conda-forge xgboost<\/p>\r\n<p>Install XGBoost extension for Knime<\/p>\r\n<p><a href=\"http:\/\/sadievrenseker.com\/wp-content\/uploads\/2019\/02\/knime_xgboost.png\"><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-medium wp-image-2218\" src=\"http:\/\/sadievrenseker.com\/wp-content\/uploads\/2019\/02\/knime_xgboost-300x116.png\" alt=\"\" width=\"300\" height=\"116\" srcset=\"https:\/\/sadievrenseker.com\/wp-content\/uploads\/2019\/02\/knime_xgboost-300x116.png 300w, https:\/\/sadievrenseker.com\/wp-content\/uploads\/2019\/02\/knime_xgboost-768x297.png 768w, https:\/\/sadievrenseker.com\/wp-content\/uploads\/2019\/02\/knime_xgboost-1024x396.png 1024w, https:\/\/sadievrenseker.com\/wp-content\/uploads\/2019\/02\/knime_xgboost.png 1568w\" sizes=\"auto, (max-width: 300px) 100vw, 300px\" \/><\/a><\/p>\r\n<p>&nbsp;<\/p>\r\n<\/td>\r\n<\/tr>\r\n<tr>\r\n<td><strong>Week 12 (May 7): Project Presentations First Group. <\/strong><br \/>Presentations will be picked randomly during the class and anybody absent will be considered as not presented. <br \/>Project Deliveries (until May 6): Project Presentation, Project Report (explaining your project, your approach and methodologies, difficulties you have faced, solutions you have found, results you have achieved in your projects, links to your data sources). Knime Workflows (in .knwf format) and python codes (in .py format). 
<strong>Please<\/strong> make all these files into a single .zip or .rar archive, and do not put more than 4 files in your archive.<\/td>\r\n<\/tr>\r\n<tr>\r\n<td>\r\n<p><strong>Week 13 (May 14): Project Presentations, Second Group<\/strong><\/p>\r\n<p>If you missed the project presentations in the first week, please contact me for further details.\u00a0<\/p>\r\n<\/td>\r\n<\/tr>\r\n<\/tbody>\r\n<\/table>\r\n","protected":false},"excerpt":{"rendered":"<p>Antalya Science University Course Name: Introduction\u00a0to\u00a0Data Science Course Code:\u00a0CS 447 Language of Course:\u00a0English Credit:\u00a03 Course Coordinator \/ Instructor:\u00a0\u015eadi Evren \u015eEKER Contact:\u00a0intrds@sadievrenseker.com Schedule:\u00a0Tuesday 15.00 &#8211; 18.00 Course Description: \u00a0This course is an introduction level course to data science, specialized on machine learning, artificial intelligence and big data. The course starts with a top down approach to data science projects. 
The first &hellip; <a href=\"https:\/\/sadievrenseker.com\/?p=2158\">Continue Reading <span class=\"meta-nav\">&rarr;<\/span><\/a><\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"closed","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[1],"tags":[],"class_list":["post-2158","post","type-post","status-publish","format-standard","hentry","category-uncategorized"],"_links":{"self":[{"href":"https:\/\/sadievrenseker.com\/index.php?rest_route=\/wp\/v2\/posts\/2158","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/sadievrenseker.com\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/sadievrenseker.com\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/sadievrenseker.com\/index.php?rest_route=\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/sadievrenseker.com\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=2158"}],"version-history":[{"count":20,"href":"https:\/\/sadievrenseker.com\/index.php?rest_route=\/wp\/v2\/posts\/2158\/revisions"}],"predecessor-version":[{"id":2175,"href":"https:\/\/sadievrenseker.com\/index.php?rest_route=\/wp\/v2\/posts\/2158\/revisions\/2175"}],"wp:attachment":[{"href":"https:\/\/sadievrenseker.com\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=2158"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/sadievrenseker.com\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=2158"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/sadievrenseker.com\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=2158"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}