The end of the winter semester is (finally) here, and so is the end of my semester project. What I’ve been trying to do the whole time is to write about my work as understandably as I could. The main task was to get to know artificial intelligence better; or rather, only a part of it, since AI covers such a broad range. Machine learning is one approach and neural networks are another, among others. Machine learning offers several tools and ways to solve problems. A common problem is classifying data, and that was also my goal. Classifying a bunch of samples from a dataset using different machine learning methods was a good way to explain them. I wrote 11 blog posts, because I had to and because I found it quite interesting. Now, I would like to summarize all of them and put them in order. I added a few explanatory words to each post, because their titles sometimes don’t make sense even to me.

  1. In Vino Veritas, Artificial Intelligence Approves – At the beginning, there is an explanation of what AI is and what AI is not. My task for the semester project is introduced, as well as my motivation to do it. I was full of expectations.
  2. From the Root through the Branches to the Leaves – The first encounter with MATLAB, the first machine learning classifier, the first results. An example script is included. Everything worked well so far…
  3. Lose Way and Hope in a Forest of a Binary Trees – This time, the classification trees were created with optimized hyperparameters. Expecting better results is natural, but also incorrect, as MATLAB shows.
  4. An Enhanced Neighbor Is Better than any Tree – The second classifier; I put the steps from posts 2 and 3 together. Despite the simplicity of the k-nearest neighbors algorithm, the results were better.
  5. The Naive Probability Is the Expert on Wine Quality – The classification model based on probabilities proved to work very well. I tried hard to explain why.
  6. The Wine Dataset Is Divided by Support Vector Machines – The next classifier was the support vector machine. The hyperparameter optimization again took very long, but I didn’t complain anymore; I got used to it.
  7. Discriminant Analysis Triumphantly Ends Comparison of Classifiers – The last machine learning classifier and also the best one. In this post, all classifiers are compared in a graph as well as in tables. The project conclusion is already written here.
  8. Neural Networks, the All-Purpose Tool? – Neural networks have a special place among artificial intelligence tools. I simply couldn’t skip them.
  9. Solving Another Problem, That’s What AI Does – Just in case there was too much wine (although there’s never enough wine), here is a new problem to solve.
  10. Now I Know My Grades Before Taking Exams – This wasn’t even part of my semester project; I only desired to predict the future and shed my exam anxiety.
  11. My Farewell and What Was Going on Here? – A self-referencing link without a purpose, just to have all the posts neatly together.

My recommendation is to read these posts in order, because there may be links to previous ones. I suppose I should probably say some final words. What I got from this semester project: a lot, a good insight into AI, and some skills too. What it took from me: a lot of time, but I’m not complaining (at least not out loud).


