OK, everybody is talking about Artificial Intelligence and Machine Learning, often interchangeably, and most of the time just confusing people. First of all, they are not the same concept. AI is a very large field covering a bunch of topics, and Machine Learning is just one of them.
Artificial Intelligence vs. Machine Learning
There are articles and blog posts out there that clearly state that Machine Learning is an “AI goal or problem”, and that machine learning as such can be viewed as a subfield of AI research.
In short, Machine Learning is what an intelligent agent needs to adapt to new circumstances and to detect and extrapolate patterns.
As a passionate reader on the subject, I can highly recommend Artificial Intelligence: A Modern Approach by Stuart J. Russell and Peter Norvig.
Another book I still have on my bookshelf since I graduated from KU Leuven in 2000 is Machine Learning by Tom M. Mitchell… It is the only book I actually kept from university, and the only one that hasn’t been eaten by mice in the attic of the stable at home.
The evolution of Machine Learning
Since 2001, when I graduated as a Master of Science in Computer Science, a lot has changed. Back then, Machine Learning was more of an academic topic. Finding a job that had anything remotely to do with Machine Learning was hard, not least because the Euro was about to be introduced and companies were looking for software engineers everywhere. As a result, my thesis became my last AI achievement for a long time.
That changed when I started working at ToThePoint, where Managing Partner Steven Heyninck convinced me to pick up where I had left off.
What has changed since I left the field? Not that much, actually, except that:
- there is more data available,
- there is more computational power and storage,
- there are more frameworks to ease development,
- there are more tools to deploy, govern, and monitor your models.
Well, to be honest, I also noticed quite some progress in the field itself.
For instance, the Support Vector Machine (SVM) was entirely new to me. I have read a lot of papers about the algorithm, and the oldest ones were written in the late ’90s, so it makes sense that SVMs received less attention during my years at university.
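For readers who, like me back then, have never met an SVM: the core idea is a classifier that separates two classes with the widest possible margin. A minimal sketch, not part of our project, is a linear SVM trained with hinge-loss sub-gradient descent (Pegasos-style) on made-up toy data; the data and hyperparameters below are purely illustrative:

```python
def train_svm(data, labels, lam=0.1, epochs=100):
    """Linear SVM via hinge-loss sub-gradient descent (Pegasos-style).

    data:   list of feature tuples
    labels: +1 / -1 class labels
    lam:    regularization strength (illustrative value)
    """
    w = [0.0] * len(data[0])
    t = 0
    for _ in range(epochs):
        for x, y in zip(data, labels):
            t += 1
            eta = 1.0 / (lam * t)  # decaying learning rate
            score = sum(wi * xi for wi, xi in zip(w, x))
            if y * score < 1:
                # point violates the margin: hinge loss pushes w toward it
                w = [wi - eta * (lam * wi - y * xi) for wi, xi in zip(w, x)]
            else:
                # correctly classified with margin: only shrink (regularize)
                w = [wi * (1 - eta * lam) for wi in w]
    return w

def predict(w, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) >= 0 else -1

# Toy, linearly separable data: class +1 around (2, 2), class -1 around (-2, -2)
X = [(2, 2), (3, 2), (2, 3), (-2, -2), (-3, -2), (-2, -3)]
y = [1, 1, 1, -1, -1, -1]
w = train_svm(X, y)
print(predict(w, (2.5, 2.5)))    # → 1
print(predict(w, (-2.5, -2.5)))  # → -1
```

A real-world implementation would use an off-the-shelf solver (e.g. scikit-learn’s `svm.SVC`), which also supports kernels for non-linear boundaries; the sketch above only shows the linear, bias-free case.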
Another example: given the factors above, deep learning also became feasible. In the past, when I was working on a project to recognize handwritten text and classify it as female or male handwriting, it took quite a while to train even the simplest neural network. Our procedure was as follows: tweak a parameter in the evening, say a little prayer, go to bed, wake up, and hope for the best… repeat.
We didn’t even dare to add a ton of hidden layers, purely out of fear of missing our project deadline.
All these factors have put Machine Learning back on the hype map, and PowerPoint after PowerPoint was born.
The Concept of ToTheArcade
At ToThePoint we like to build things instead of “PowerPointing” people to death. So, what did we do? We gave ourselves a little assignment and a first machine learning concept:
“Build a machine learning pipeline based on real, live-captured IoT data, and deploy it to a production environment to make real-time predictions.”
We have some fun projects going on at ToThePoint, and one of them is ToTheArcade: an arcade cabinet built from the ground up and stacked with IoT. This enabled us to capture data and find plenty of volunteers to produce it…
Because, hey, who doesn’t like to play an old-school game like Mortal Kombat on an arcade machine?
So, we redefined our assignment and asked ourselves the following:
“Can we train a model to recognize patterns in button and joystick usage on our pimped arcade cabinet, deploy it in the cloud, and make predictions in real time while somebody is actually playing a random game?”
The answer, in short: “Yes, we can!” (hopefully that statement isn’t patented yet).
The IoT panel for predictions
We even built an IoT panel to visualize the results. This panel shows all kinds of information about our predictions, such as prediction relevance, and it even lets you intervene in the prediction process.
Yes, in our PoC, IoT data is given context, turned into wisdom, and turned into IoT again. This panel really completes the circle.
I have explained why we did it; in the following blog posts I will explain how we did it, the lessons we learned, and our future steps to broaden our proof of concept.
So stay tuned and follow our social media for more blog posts diving into our ToTheArcade ML project!
Credits: blog post by Kevin Smeyers, Machine Learning Master at ToThePoint