AI News sat down with Piero Molino, CEO and co-founder of Predibase, during this year’s AI & Big Data Expo to discuss the importance of low-code in machine learning and trends in LLMs (Large Language Models).
At its core, Predibase is a declarative machine learning platform that aims to streamline the process of developing and deploying machine learning models. The company is on a mission to simplify and democratise machine learning, making it accessible to both expert organisations and developers who are new to the field.
The platform empowers organisations with in-house experts, enabling them to supercharge their capabilities and reduce development times from months to just days. Additionally, it caters to developers who want to integrate machine learning into their products but lack the expertise.
By using Predibase, developers can avoid writing extensive low-level machine learning code and instead work with a simple configuration file – a YAML file – containing as few as 10 lines that specify the data schema.
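The article does not reproduce the configuration format itself, but a hypothetical sketch gives a sense of what such a declarative file looks like. The field names below follow the open-source Ludwig library (also created by Molino, and the foundation of Predibase); the column names are invented for illustration:

```yaml
# Hypothetical declarative model config (Ludwig-style field names;
# the dataset columns are invented for illustration).
input_features:
  - name: review_text
    type: text
  - name: product_category
    type: category
output_features:
  - name: sentiment
    type: category
```

The configuration declares *what* the model should learn (inputs and outputs), leaving the *how* – architecture, preprocessing, training loop – to the platform's defaults, which can be overridden as needed.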
Predibase reaches general availability
During the expo, Predibase announced the general availability of its platform.
One of the key features of the platform is its ability to abstract away the complexity of infrastructure provisioning. Users can seamlessly run training, deployment, and inference jobs on a single CPU machine or scale up to 1,000 GPU machines with just a few clicks. The platform also integrates with various data sources – including data warehouses, databases, and object stores – regardless of the data structure.
“The platform is designed for teams to collaborate on developing models, with each model represented as a configuration that can have multiple versions. You can analyse the differences and performance of the models,” explains Molino.
Once a model meets the required performance criteria, it can be deployed for real-time predictions as a REST endpoint or for batch predictions using SQL-like queries that include prediction capabilities.
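Predibase does not publish its query syntax in this article, so the following is purely an illustrative sketch of what a SQL-like predictive query could resemble; the keywords, table, and column names are all hypothetical:

```sql
-- Hypothetical SQL-like predictive query (illustrative syntax only;
-- table and column names are invented).
PREDICT sentiment
GIVEN SELECT review_text, product_category
FROM product_reviews;
```

The appeal of this style is that analysts can request batch predictions with a familiar query idiom instead of writing inference code against a model-serving API.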
Importance of low-code in machine learning
The conversation then shifted to the importance of low-code development in machine learning adoption. Molino emphasised that simplifying the process is essential for wider industry adoption and increased return on investment.
By reducing the development time from months to a matter of days, Predibase lowers the entry barrier for organisations to experiment with new use cases and potentially unlock significant value.
“If every project takes months or even years to develop, organisations won’t be incentivised to explore valuable use cases. Lowering the bar is crucial for experimentation, discovery, and increasing return on investment,” says Molino.
“With a low-code approach, development times are reduced to a couple of days, making it easier to try out different ideas and determine their value.”
Trends in LLMs
The discussion also touched on the rising interest in large language models. Molino acknowledged the tremendous power of these models and their ability to transform the way people think about AI and machine learning.
“These models are powerful and revolutionising the way people think about AI and machine learning. Previously, collecting and labelling data was necessary before training a machine learning model. But now, with APIs, people can query the model directly and obtain predictions, opening up new possibilities,” explains Molino.
However, Molino highlighted some limitations, such as the cost and scalability of per-query pricing models, the relatively slow inference speeds, and concerns about data privacy when using third-party APIs.
In response to these challenges, Predibase is introducing a mechanism that allows customers to deploy their models in a virtual private cloud, ensuring data privacy and providing greater control over the deployment process.
As more businesses venture into machine learning for the first time, Molino shared his insights into some of the common mistakes they make. He emphasised the importance of understanding the data, the use case, and the business context before diving headfirst into development.
“One common mistake is having unrealistic expectations and a mismatch between what they expect and what is actually achievable. Some companies jump into machine learning without fully understanding the data or the use case, both technically and from a business perspective,” says Molino.
Predibase addresses this challenge by offering a platform that facilitates hypothesis testing, integrating data understanding and model training to validate the suitability of models for specific tasks. With guardrails in place, even users with less experience can engage in machine learning with confidence.
The general availability launch of Predibase’s platform marks an important milestone in their mission to democratise machine learning. By simplifying the development process, Predibase aims to unlock the full potential of machine learning for organisations and developers alike.