AI for Developers: Bringing AI into every application
One of the challenges for developers who want to get into AI is that it's a substantial undertaking: there is a whole new set of tools to learn. It's hard to know where to start with questions like "well, what is a neural network good for?", "how would I train something on this data set?", and "when do I stop training and start deploying?"
Tools such as ChatGPT are great for programming when you have a specific task and are not familiar with a popular API. It can be difficult to drive ChatGPT to an exact solution when the problem statement is non-trivial, but it’s often quite useful for a first sketch of a feature or an interaction with a library. I’ve used it many times: to help with matplotlib, to work through things in React, to write animation code in JavaScript, and more. It has also been useful when debugging PyTorch tensor issues.
In the future, software development will be more about systems design and less about code minutiae, and that’s probably a good thing for everyone.
As developers, we're fortunate to have these co-pilot assistants for the development process, whether we are working with existing code or generating new code. But what about bringing AI into the applications themselves? What if we want to predict house prices, when a piece of equipment is going to fail, or which customer to call next?
Well, check out this schematic:
There are two main touchpoints between you and Featrix: first, the training data coming in, and second, your application making predictions by calling Featrix. When you hand Featrix your data, it uses deep learning techniques to build a foundational model of that data. Featrix then trains specific task functions that we call neural functions. All of these capabilities are powered by multiple neural networks.
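To make the schematic concrete, here is a minimal sketch of what that flow can look like from application code. This is purely illustrative: the client object, method names, file name, and column names below are hypothetical placeholders for the sake of the example, not the actual Featrix API.

```python
# Illustrative only: FeatrixClient and its methods are hypothetical placeholders.
client = FeatrixClient(api_key="...")  # hypothetical client object

# Touchpoint 1: training data goes in. Featrix builds its foundational model
# on the data and then trains a neural function for the target you care about.
project = client.upload_training_data("historical_house_sales.csv")
price_fn = project.train_neural_function(target="sale_price")

# Touchpoint 2: your application calls Featrix to get predictions on new records.
predicted = price_fn.predict({"bedrooms": 3, "sqft": 1450, "zip": "97214"})
print(predicted)
```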
What is a neural network?
Neural networks are functions: they have an input, they do something, and there is an output. The catch is that we don't know what the function should actually do to the input to produce the outputs. To characterize the behavior we want, we need a set of example inputs and outputs. When we train the neural network, we are asking the computer to figure out the relationships in that data so that we can predict what the output would be for new inputs that are not in our training data.
A neural network is a trainable function.
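As a tiny concrete example, here is a trainable function in PyTorch (mentioned earlier in this post). The data is made up for illustration: we give the network example inputs and outputs for y ≈ 2x + 1, and training adjusts the function's internal parameters until its outputs match the examples.

```python
import torch
import torch.nn as nn

# Example data: inputs x and the outputs we want, y ≈ 2x + 1 (plus a little noise).
x = torch.linspace(-1, 1, 100).unsqueeze(1)
y = 2 * x + 1 + 0.05 * torch.randn_like(x)

# A small neural network: a function with trainable parameters inside.
model = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1))

optimizer = torch.optim.Adam(model.parameters(), lr=0.01)
loss_fn = nn.MSELoss()

# Training: nudge the parameters until the function's outputs match the examples.
for step in range(500):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()

# Inference: call the trained function on an input it has never seen.
print(model(torch.tensor([[0.25]])))  # should be close to 2 * 0.25 + 1 = 1.5
```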
What can I do with neural networks?
As developers, we can do quite a bit. When we use a neural network, we call that prediction or inference. We can predict numerical values, such as when a component in a system might fail. We can predict the category of an object, such as whether a toy is a Lego or not. We can predict which item in a list the user is most likely to click on, or which item the user is least likely to have heard of.
The three categories of prediction are called regression (numerical outputs), classification (a categorical output), and recommendation (picking something from a set). In all cases, we compute probabilities. The underlying network might be quite complicated (many layers and nodes) or it could be simple (fewer layers and nodes). Indeed, there is a whole craft (part science, part art) to picking a network size and adjusting various settings called hyperparameters.
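Here is a hedged sketch of how those three categories differ in code, again using PyTorch purely for illustration: mostly it is the output layer that changes. The feature count, class count, and item count below are made up.

```python
import torch
import torch.nn as nn

features = torch.randn(1, 8)  # one example with 8 input features (made up)

# Regression: a single numeric output, e.g. "hours until the component fails".
regression_head = nn.Linear(8, 1)
print(regression_head(features))

# Classification: one score per class, turned into probabilities with softmax,
# e.g. P(Lego) vs. P(not Lego).
classification_head = nn.Linear(8, 2)
print(torch.softmax(classification_head(features), dim=1))

# Recommendation (one simple form): score every item in a set and pick the one
# the user is most likely to click.
item_scores = nn.Linear(8, 5)(features)    # 5 candidate items
probabilities = torch.softmax(item_scores, dim=1)
print(probabilities.argmax(dim=1))         # index of the top item
```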
There’s a catch in preparing the data itself: the neural network needs the data represented as numbers. If the data contains strings, whether long free-form text or short labels, we have to encode them. We may also have missing values and have to decide what to do with them: some people drop the records with missing values, others fill in a synthetic value, and these approaches have significant tradeoffs.
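As a sketch of what that work looks like today, here is how the encoding and the missing-value decision might be handled with pandas. The columns are made up, and both strategies shown carry the tradeoffs just described.

```python
import pandas as pd

df = pd.DataFrame({
    "city": ["Portland", "Seattle", None, "Portland"],  # a string column with a gap
    "sqft": [1450, None, 2100, 1800],                   # a numeric column with a gap
    "price": [450_000, 620_000, 510_000, 480_000],
})

# Strings must become numbers: one-hot encode the 'city' column.
encoded = pd.get_dummies(df, columns=["city"])

# Missing values, option 1: drop rows that still contain gaps (loses data).
dropped = encoded.dropna()

# Missing values, option 2: fill in a synthetic value (distorts the distribution).
imputed = encoded.fillna({"sqft": encoded["sqft"].median()})

print(dropped)
print(imputed)
```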
Oftentimes there are “hidden” features in the data. Perhaps we care about the day of the week when we have a bunch of timestamps; maybe we’re trying to predict the flow of retail customers at a grocery store. A different business, such as a restaurant, might have those same timestamps but care about time of day more than day of week.
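Pulling those hidden features out is usually manual work. A typical pandas version of the timestamp example might look like this (the column names are illustrative):

```python
import pandas as pd

visits = pd.DataFrame({
    "timestamp": pd.to_datetime([
        "2024-03-01 09:15", "2024-03-02 18:40", "2024-03-04 12:05",
    ]),
    "customers": [42, 117, 63],
})

# A grocery store might care about the day of week...
visits["day_of_week"] = visits["timestamp"].dt.day_name()

# ...while a restaurant might care more about the hour of day.
visits["hour_of_day"] = visits["timestamp"].dt.hour

print(visits)
```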
Data preparation limits AI teams
That’s one of the key reasons we built Featrix. For nearly every team working on predictive AI, the complexity of preparing data is so high that the team is vastly limited in what they can achieve.
The tradeoffs described above have a statistical impact on the questions being asked and the predictions that answer those questions, which means a background in statistics is often needed to understand them. But most developers don’t have that background.
How do we build higher?
Steve Jobs said in the late 90s that, as a developer, you can probably build a few “floors” of a building before it falls in on you, so the key to building higher with ease is to start on a higher floor. His argument was that if Apple provided APIs that let you build from the 20th floor instead of the second floor, you could skip the work required to get to the 20th floor and just focus on your application.
We agree. So we aim to give developers a big head start over all the other AI toolkits and runtimes out there. We simplify the problem: you provide inputs and Featrix constructs what we call neural functions. There’s a ton of deep learning and statistical machinery we leverage to build the best possible neural functions on your data, and you don’t have to know anything about any of it. You don’t have to clean or process your data. You just give it to Featrix and let it do its thing.
Q&A
How do I try Featrix?
You can sign up and give it a spin today. We include several demos and public data sets to experiment with.
What languages can I use with Featrix?
We provide Python and JavaScript APIs out of the box. Under the hood it's all HTTPS, so we can build bindings for any language; support for other languages is coming. If you'd like to work with something else, drop us a note at hello@featrix.ai.
How does Featrix deal with missing values or noisy values?
Featrix builds what's called an embedding space as a foundational model on your data. We transform your input into vectors via dynamic encoding. This encoding handles missing values without changing the underlying distribution, and it leverages the mutual information across columns so that values that mean the same thing but are represented differently end up in similar places in the embedding space. Our neural functions are models trained on top of the embedding space's transformation, which means they tend to be fairly simple models, since the embedding space takes care of transforming the input.
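As a rough intuition for what "similar places in the embedding space" means, here is a toy sketch with made-up vectors. It is not how Featrix computes its embeddings (those are learned from your data); it only illustrates the geometry.

```python
import numpy as np

def cosine_similarity(a, b):
    """Higher means the two vectors point in more similar directions."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hand-picked toy vectors purely for illustration; Featrix learns its own.
embeddings = {
    "NY":            np.array([0.92, 0.35, 0.10]),
    "New York":      np.array([0.90, 0.40, 0.12]),
    "San Francisco": np.array([0.15, 0.20, 0.95]),
}

# Two spellings of the same place land close together...
print(cosine_similarity(embeddings["NY"], embeddings["New York"]))       # ~1.0
# ...while a different place lands far away.
print(cosine_similarity(embeddings["NY"], embeddings["San Francisco"]))  # much lower
```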
The good news is that you don't need to understand any of that in a deep way to be productive with Featrix.
Do I need to clean my data first?
Nope. When we started Featrix, we did extensive customer interviews. What we heard over and over was that data preparation is a huge stumbling block for even the biggest and most experienced data science teams. So we designed Featrix specifically to eliminate data preparation from the mix. As we've done more work, we've found that not only does our approach mean you do not have to clean the data, it is actually preferable not to. The problem with cleaning data is that you remove subtle signal: there may be a good reason something is missing, or slightly different strings may indicate the rows came from different sources. Scrubbing out that variety can erase meaningful structure that the AI you build on this data could otherwise learn from!
My data is very special and requires special handling.
Statistically speaking, this is extremely unlikely, though a lot of people think it. Automation is coming for many of the most boring things in the world, and data preparation is a key candidate. With an automated approach, we think developers will be able to do far more with predictive AI: bringing together more data sets and building more customization than ever before. And with the ability to extract structured data from unstructured sources, the amount of data all of us are going to want to make predictions on is rapidly increasing by an order of magnitude.
In other words, we don't think anyone will be bored any time soon!
Can I run Featrix in my private cloud or otherwise in my perimeter?
Yes. We ship private Docker instances with an enterprise license; you can run them with Docker Compose or in a Kubernetes environment. Contact hello@featrix.ai to discuss.