All standard workshops can be adapted to your specific needs - just reach out to clarify the details
Half-day introduction to the basics of AI, machine learning and typical use-cases.
Primarily targeting managers with tight schedules who need a quick grasp of the topic.
Full-day introduction to AI. The goal of this day is to enable participants to define requirements for AI projects, find the right data sources and challenge or steer internal and external implementation providers.
AI for Developers
Multi-day deep dive for IT, developer and data science staff. Based on detailed Python examples and hands-on assignments, you will learn all the basics, from loading and cleaning your data to training sophisticated machine learning models.
We are happy to offer you a tailor-made selection of the content modules or to prepare special content on request.
All content modules can be combined into individual workshops - we will be happy to include specific content and use-cases for your needs.
Is it artificial intelligence or machine learning? What is intelligence in the first place? And what is deep learning?
How does all of this connect to big data, Industry 4.0 and robotics?
In this module we clear up some of the most common buzzwords and set the context for the whole workshop.
You know it when you see it - which is why we discuss some well-known use cases, from AlphaGo to self-driving cars, at the beginning of the workshop to give you a feeling for what the frontier of AI research looks like.
While we are currently experiencing great hype around AI, the field itself has quite a history. We explain some of the key moments and discuss what was regarded as AI in the 80s and 90s.
This module in our management introduction focuses entirely on how to align your AI efforts with your existing company strategy. Do you need an AI strategy, or rather a strategy incorporating AI? Do you need a CAIO? Should you start quickly with external help, or focus on educating your existing teams?
This module is one of the core elements of all of our workshops. You will learn what a machine learning model is on an abstract level. We will then explore some specific models and (except in the intro workshop) work our way up to neural networks. You will understand the relationship between available data and feasible model complexity.
Of course every great technology trend comes with risks attached. What are adversarial examples, and how can we attack an AI with them? What can go wrong when integrating AI models into production environments? How does bias in your training data affect the model results, and what ethical aspects should you pay attention to when applying artificial intelligence in your company?
One key skill in an AI-driven world will be the ability to translate between increasingly complex business use-cases and state-of-the-art machine learning approaches. In this hands-on workshop section we will focus on how to define crisp and clear input and output formats and how to articulate our requirements for the outcome of an AI system.
Deep neural networks are at the core of the current exponential developments in artificial intelligence. But how do they work? What are artificial neurons? And what is the link between the very basic machine learning models we saw previously in the workshop and those highly complex models?
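As a minimal taste of the idea (weights and inputs below are made up for illustration, not workshop material), a single artificial neuron is just a weighted sum of its inputs plus a bias, passed through a non-linear activation:

```python
import numpy as np

def sigmoid(z):
    # Squashes any number into the range (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

def neuron(x, weights, bias):
    # Weighted sum of inputs, then the non-linear activation
    return sigmoid(np.dot(x, weights) + bias)

# Illustrative values: with strong positive weights on the active
# inputs, the neuron's output approaches 1 (it "fires").
x = np.array([1.0, 0.0, 1.0])
w = np.array([4.0, -2.0, 4.0])
out = neuron(x, w, bias=-6.0)  # sigmoid(4 + 0 + 4 - 6) = sigmoid(2) ≈ 0.88
```

A deep network stacks many layers of such neurons, and training adjusts the weights and biases - conceptually the same "fit parameters to data" step as in the simpler models earlier in the workshop.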
Cloud services are easy to use at a low cost. But do you know what data was used to train the service? Is there any bias in the cloud-based machine learning model? How can you make sure that the service satisfies your project requirements?
In this section we will explore several cloud offerings, discuss the potential downsides of cloud-based AI solutions and show how to mitigate them.
This module is meant for developers and IT professionals who already know other programming languages but need a quick start in Python.
We will also introduce the interactive Jupyter notebooks we use in the workshop and guide you through some core principles you need to know for the upcoming interactive training.
There are many exceptionally good libraries and frameworks to help you with machine learning in Python. Three of the most important ones to start with are pandas, NumPy and scikit-learn. Together we will walk through the core principles of all three packages, and then you will practice all steps hands-on with classic machine learning datasets like Iris and MNIST.
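To give a flavour of what the hands-on part looks like, here is a minimal sketch combining all three packages on the Iris dataset (the classifier and its parameters are illustrative choices, not the workshop's fixed curriculum):

```python
import pandas as pd
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

# Load the classic Iris dataset as a pandas DataFrame
iris = load_iris(as_frame=True)
df = iris.frame

# pandas DataFrames are backed by NumPy arrays: features X, labels y
X = df.drop(columns="target").to_numpy()
y = df["target"].to_numpy()

# Hold out part of the data to evaluate the trained model honestly
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42, stratify=y)

# A simple scikit-learn classifier: fit on training data, score on test data
clf = KNeighborsClassifier(n_neighbors=5).fit(X_train, y_train)
accuracy = clf.score(X_test, y_test)
```

The fit/predict/score pattern shown here is shared by virtually all scikit-learn models, which is why it is worth practicing early.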
We will explore the two most basic forms of machine learning (in this case rather basic maths) and lay the foundation for understanding the concept of training a model. Furthermore we will learn how data can be fed into a model and how classification works - i.e. the output we want from the AI is a class (dog/cat/mouse).
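A rough sketch of the two model families, again on Iris (feature choices here are illustrative): linear regression predicts a continuous number, while logistic regression is a classifier whose output is one of a fixed set of classes.

```python
from sklearn.datasets import load_iris
from sklearn.linear_model import LinearRegression, LogisticRegression
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)

# Linear regression: fit a line to a continuous target.
# Here we predict petal width (column 3) from petal length (column 2).
lin = LinearRegression().fit(X_train[:, [2]], X_train[:, 3])

# Logistic regression: despite the name, a classifier.
# Its output is a class label (here one of the 3 iris species).
log = LogisticRegression(max_iter=1000).fit(X_train, y_train)
predicted_class = log.predict(X_test[:1])[0]
test_accuracy = log.score(X_test, y_test)
```

Both models are "trained" by adjusting a handful of coefficients to fit the data - the same idea that later scales up to neural networks.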
What can go wrong when you train a highly complex model with very few data? How can we detect this effect and what can we do to mitigate this so called overfitting?
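The standard way to detect overfitting is to compare performance on the training data with performance on held-out data. A sketch with a deliberately over-flexible model on a small, noisy synthetic dataset (dataset and parameters are illustrative):

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Small, noisy dataset: a flexible model can simply memorize it
X, y = make_classification(n_samples=100, n_features=20, n_informative=3,
                           flip_y=0.2, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.5, random_state=0)

# An unconstrained decision tree overfits: perfect on training data,
# clearly worse on unseen data
deep_tree = DecisionTreeClassifier(random_state=0).fit(X_train, y_train)
train_acc = deep_tree.score(X_train, y_train)  # memorized
test_acc = deep_tree.score(X_test, y_test)     # the honest number

# Regularizing (here: limiting the tree depth) narrows that gap
small_tree = DecisionTreeClassifier(max_depth=3,
                                    random_state=0).fit(X_train, y_train)
```

The large gap between `train_acc` and `test_acc` is the signature of overfitting; mitigation means simpler models, regularization, or more data.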
Now that we have understood all the basic concepts, from training linear and logistic regression models to overfitting and regularization, we are ready to explore a set of classic machine learning models.
Although this type of classifier is very different from the approaches we have covered so far, you have to understand the unreasonable effectiveness of naive Bayes classifiers, and how they are applied to email texts, in order to have a full overview of the classic machine learning landscape.
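The classic application is spam filtering: count the words in each email and let a naive Bayes model learn which words are typical for spam. A toy sketch (the four "emails" below are invented for illustration; real filters train on thousands of messages):

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Tiny, made-up corpus of labeled emails
emails = [
    "win money now claim your prize",
    "cheap meds limited offer click now",
    "meeting agenda for monday attached",
    "please review the project report",
]
labels = ["spam", "spam", "ham", "ham"]

# Bag-of-words counts + multinomial naive Bayes: the classic spam filter
model = make_pipeline(CountVectorizer(), MultinomialNB())
model.fit(emails, labels)

prediction = model.predict(["claim your free prize now"])[0]
```

Despite its naive independence assumption between words, this simple setup is often surprisingly competitive on text - hence the "unreasonable effectiveness".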
In this demo we will walk through a neural-network-based anomaly detection method built on autoencoders. You will understand the power of the method and the prerequisites for applying it to your use-cases.
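The core idea in a nutshell: train a network to reproduce its own input through a narrow bottleneck; points it has never seen the likes of reconstruct poorly, and that reconstruction error is the anomaly score. A crude sketch using scikit-learn's MLPRegressor on synthetic data (the workshop demo itself uses a dedicated deep learning setup; data and network size here are illustrative):

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# "Normal" data lives near a 1-D line embedded in 5-D space
t = rng.uniform(-1, 1, size=(500, 1))
X_normal = np.hstack([t, 2*t, -t, 0.5*t, t]) + rng.normal(0, 0.05, (500, 5))

# Autoencoder: learn to map the input back to itself through a
# 2-unit bottleneck, so only the dominant structure can pass through
ae = MLPRegressor(hidden_layer_sizes=(2,), activation="tanh",
                  max_iter=3000, random_state=0)
ae.fit(X_normal, X_normal)

def reconstruction_error(X):
    # Mean squared error per sample: the anomaly score
    return np.mean((ae.predict(X) - X) ** 2, axis=1)

# A point far off the learned structure reconstructs badly
anomaly = np.array([[2.0, -2.0, 2.0, -2.0, 2.0]])
normal_err = reconstruction_error(X_normal).mean()
anomaly_err = reconstruction_error(anomaly)[0]
```

The key prerequisite visible even in this sketch: you need plenty of normal data to define what "normal" looks like, while anomalies need not be known in advance.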
Clustering algorithms are among the most common unsupervised methods in machine learning. We will introduce you to several concepts, from k-means clustering to DBSCAN, and walk you through the pros and cons of the different approaches depending on your dataset.
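One of those trade-offs in a small sketch (synthetic datasets and parameters chosen for illustration): k-means needs the number of clusters up front and assumes roughly round clusters, while density-based DBSCAN finds the cluster count itself and handles arbitrary shapes.

```python
from sklearn.cluster import DBSCAN, KMeans
from sklearn.datasets import make_blobs, make_moons

# Well-separated round blobs: a natural fit for k-means
X_blobs, _ = make_blobs(n_samples=300, centers=3, random_state=0)
kmeans_labels = KMeans(n_clusters=3, n_init=10,
                       random_state=0).fit_predict(X_blobs)

# Two interleaving half-moons: k-means would cut straight through them,
# while DBSCAN follows the density and needs no cluster count
X_moons, _ = make_moons(n_samples=300, noise=0.02, random_state=0)
dbscan_labels = DBSCAN(eps=0.25).fit_predict(X_moons)

# DBSCAN marks outliers with the label -1; the rest are found clusters
n_found = len(set(dbscan_labels) - {-1})
```

Which algorithm wins therefore depends on the shape and density of your data - exactly the discussion this module is about.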
This module will greatly help you in data exploration and visualization.
Sparse datasets with a very high dimensionality have several problems. The most obvious one is that we cannot visually inspect the data anymore. Further problems include that concepts like nearest neighbors based on distance break down and it can become incredibly hard to train machine learning models with the data.
Luckily there are several methods to reduce the dimensionality of a given dataset - in this section we will explore three: PCA, t-SNE and UMAP.
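As a first taste, here is PCA compressing 64-dimensional digit images down to 2 dimensions for plotting (the dataset choice is illustrative; t-SNE lives in `sklearn.manifold` and UMAP in the separate `umap-learn` package):

```python
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA

# 8x8 grayscale digits: each image is a point in 64-dimensional space
X, y = load_digits(return_X_y=True)          # X has shape (1797, 64)

# Project onto the 2 directions of highest variance
pca = PCA(n_components=2, random_state=0)
X_2d = pca.fit_transform(X)                  # shape (1797, 2) - plottable

# Fraction of the original variance the 2 components retain
explained = pca.explained_variance_ratio_.sum()
```

PCA is linear and fast; t-SNE and UMAP are non-linear and often separate the digit classes much more cleanly in 2-D, at higher computational cost - the pros and cons we compare in this section.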
We had the pleasure of working with teams from the following companies - of course we cannot share any project details, but if you need a reference we will be happy to make an introduction.
We have helped several hundred participants with diverse backgrounds - from HR, sales and marketing to IT, developers and system architects - to master their first steps in AI. We would be happy to welcome you among our clients.
We are looking forward to working with you.