Can you share a little bit about your background and what motivated you to start a company focused on neural network compression?
We also run Tinify, which has been compressing images for over 10 years. It's a very simple solution – all it takes is one click. We drew inspiration from that while looking for more innovative ways to help out in the world, and we thought we could try compressing all the new AI models.
The AI world is still in its early phase, so the people building AI models today are programmers, developers, and machine learning engineers. But I expect that to change, and in the future everyone will be able to make an AI model.
So, we want to make that as easy as possible – both for the experts and the non-experts.
Were there any specific challenges or gaps in the AI industry that you identified which led you to the inception of AIminify?
Yes – simplicity.
There are open-source tools available for compressing models, but users have to fill in 80 parameters to do it! We realized that this takes up too much time, and people shouldn't have to work so hard to compress a model successfully.
When did you first become interested in artificial intelligence and neural network models?
Looking back, I was always interested in AI models, but I was not aware that they were AI models.
My background is in manufacturing plants, where there were always AI models in use, especially in the quality systems, but at the time I was totally unaware of how they worked. For me, the big trigger was ChatGPT coming up and all the hype surrounding it.
What sets AIminify apart from other companies with similar types of services?
What we have is 10 years of experience with a user-first approach. This means that if we have to do 10 times more work to make things simpler for the user, that’s what we’ll do. And we’ve been doing things like that for over 10 years now, so that’s our DNA.
When people want to compress neural networks, they don't want to be bothered or spend too much time on it, because that's not their job – their job is to build a functional AI model.
How do you see the demand for compression solutions evolving in the AI industry?
It’s evolving rapidly.
As I mentioned, it's currently machine learning and data engineers who build AI models and neural networks. But in the near future, less technical people will be building AI models as well.
How do you address concerns related to maintaining model performance while achieving compression?
That’s a big thing. We’ve learned from interviews and talks with users that the people who build AI models have special requirements.
So, we decided to give them 5 options, ranging from very aggressive compression to hardly any compression at all. Instead of having to adjust dozens of parameters, they just click on the level they want, and that's it.
Users can play around with these levels to see how the model performs at each one. Then, they can decide what's best for their specific use case.
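To illustrate the idea of levels replacing parameters, here is a minimal sketch of how a handful of discrete compression levels might map onto a single internal knob. This uses magnitude-based weight pruning purely as an example technique; the level numbers, sparsity ratios, and function names are illustrative assumptions, not AIminify's actual interface or method.

```python
import numpy as np

# Hypothetical mapping from a user-facing compression level (1 = mildest,
# 5 = most aggressive) to an internal setting -- here, the fraction of
# smallest-magnitude weights to zero out. These ratios are assumptions
# for illustration only.
LEVEL_TO_SPARSITY = {1: 0.1, 2: 0.25, 3: 0.5, 4: 0.7, 5: 0.9}

def compress(weights: np.ndarray, level: int) -> np.ndarray:
    """Zero out the smallest-magnitude weights for the chosen level."""
    sparsity = LEVEL_TO_SPARSITY[level]
    # Threshold below which weights are considered negligible.
    threshold = np.quantile(np.abs(weights), sparsity)
    pruned = weights.copy()
    pruned[np.abs(pruned) < threshold] = 0.0
    return pruned

# A user would try a few levels and inspect the trade-off:
rng = np.random.default_rng(0)
w = rng.normal(size=1000)
for level in (1, 3, 5):
    zeros = np.mean(compress(w, level) == 0.0)
    print(f"level {level}: {zeros:.0%} of weights zeroed")
```

The point of the design is that the user only ever chooses one of five levels; everything behind the mapping stays hidden, which is the "one click" philosophy carried over from Tinify.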