Overview of topics

All of the knowledge

In the wiki, we dive into visionAI concepts across a wide variety of topics. The list will be ever-growing, but below you will find our attempt at putting it all into a meaningful structure.

Use the menu on the left-hand side to navigate through the individual terms and categories, and the menu on the right-hand side to scroll through the different sections of an article.

Model families 👾

In this section, we cover the most widely used model families in visionAI today. Each model family solves one specific type of problem in computer vision. We will describe the respective task for each model family, give a brief overview of how these models work, and link to other helpful information, such as which metrics to use or the most groundbreaking papers in the family.

And yeah, we know that transformers are a thing, but they are not yet viable for production. This is why we're not including them for now.

Model architectures 🏛

Most of the fancy papers published are about model architectures. Of course, the range of model architectures out there is endless, so we'll focus on the most commonly used ones and the ones available in Hasty's model zoo. Whenever we add a model there, we'll also add it to the wiki.

If there's an architecture you're passionate about but not present in the wiki yet, please go ahead and add it.

Metrics 📊

Is it SOTA? Metrics are used to measure the performance of a model. Each model family and use case requires different metrics. We discuss which metric you should use for which use case, provide you with benchmarks and an intuition behind each metric, and share small code snippets to calculate them.
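
To give you a taste of what such a snippet might look like, here is a minimal sketch computing Intersection over Union (IoU), a common detection metric. The (x1, y1, x2, y2) box format and the example boxes are just illustrative assumptions.

```python
# Minimal sketch: Intersection over Union (IoU) for two axis-aligned boxes.
# Boxes are assumed to be in (x1, y1, x2, y2) format -- an illustrative choice.

def iou(box_a, box_b):
    # Coordinates of the intersection rectangle
    x1 = max(box_a[0], box_b[0])
    y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2])
    y2 = min(box_a[3], box_b[3])

    # Intersection area (zero if the boxes do not overlap)
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)

    # Union = sum of both areas minus the intersection
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter

    return inter / union if union > 0 else 0.0

print(iou((0, 0, 10, 10), (5, 5, 15, 15)))  # ~0.143
```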

Loss 📉

Loss is the number we always want to see going down, and ideally, converging to zero. The loss function is the target function which a neural network minimizes. It is typically some disparity between a model's prediction and the ground truth data. Different loss functions are suitable for different use cases. In the wiki, we'll explore which loss functions you should use depending on your task and how to compute them.
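
As a small illustration of that idea, here is a sketch of two common loss functions, mean squared error and cross-entropy, written in plain Python. The example values are made up for demonstration only.

```python
import math

# Minimal sketch: two common loss functions, each measuring the disparity
# between a model's prediction and the ground truth.

def mse(predictions, targets):
    # Mean squared error, typical for regression-style outputs
    return sum((p - t) ** 2 for p, t in zip(predictions, targets)) / len(targets)

def cross_entropy(predicted_probs, true_class):
    # Negative log-likelihood of the correct class, typical for classification
    return -math.log(predicted_probs[true_class])

print(mse([0.9, 0.2], [1.0, 0.0]))        # 0.025
print(cross_entropy([0.1, 0.7, 0.2], 1))  # ~0.357
```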

Solver / Optimizer 🧮

Solvers, also called optimizers, are the algorithms that navigate the loss landscape and converge towards your model's minimal loss. We'll cover the most commonly used ones and dive deeper into the hyper-parameters you can set here and how they influence your model.
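
To make the idea concrete, here is a minimal sketch of the simplest such algorithm, vanilla gradient descent, applied to a toy one-dimensional loss. The loss function and learning rate are illustrative assumptions, not a recipe for real training.

```python
# Minimal sketch: vanilla gradient descent on a toy 1-D loss, loss(w) = (w - 3)^2.
# The learning rate (lr) is one of the hyper-parameters the wiki dives into.

def grad(w):
    # Analytic gradient of (w - 3)^2
    return 2 * (w - 3)

w = 0.0   # initial weight
lr = 0.1  # learning rate, an illustrative value

for step in range(50):
    w = w - lr * grad(w)  # the basic update rule: w <- w - lr * dL/dw

print(w)  # converges towards the minimum at w = 3
```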

Scheduler ⏰

Schedulers are the algorithms that control how the optimizer ramps up. Concretely, they can modulate the optimizer's learning rate during training.
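
As a rough illustration, here is a sketch of one possible schedule, a linear warm-up followed by step decay. The epoch counts and decay factor are illustrative assumptions, not recommended defaults.

```python
# Minimal sketch of a learning-rate schedule: linear warm-up, then step decay.

def scheduled_lr(epoch, base_lr=0.1, warmup_epochs=5, decay_every=30, decay_factor=0.1):
    if epoch < warmup_epochs:
        # Ramp the learning rate up linearly from 0 to base_lr
        return base_lr * (epoch + 1) / warmup_epochs
    # Afterwards, shrink it by decay_factor every decay_every epochs
    return base_lr * decay_factor ** ((epoch - warmup_epochs) // decay_every)

for epoch in [0, 2, 4, 10, 40, 70]:
    print(epoch, scheduled_lr(epoch))
```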