Integrate AI Into Any Application in Minutes


In this new paradigm, don’t waste time building and deploying; instead, focus on your data and use case. Use existing Spawner building blocks, or hand your model off to Spawner to handle deployment, serving, and infrastructure.

We want the software engineer to be able to integrate Spawner into their client’s application in minutes. We want the Data Scientist or ML Engineer to be able to augment their current workflow with Spawner in seconds. We want to handle the overhead and infrastructure so you can focus on what really matters — delivering value to your project, business, and customers!

/answer — one example of what the Spawner API can do…

You can think of the Spawner API as a set of legos (each lego representing some Machine Learning function) that you can piece together to build any sort of application. These endpoints expose pieces of our API so that anyone with an API key can combine that functionality.

Piece together the building blocks

/clean — use this endpoint to clean your data (in our current build we support cleaning text for use in NLP) ex. “What is#!!@ tHe revENue of aPPle?” → “what is the revenue of apple”

/understand — use this endpoint to extract the keywords from a statement, phrase, paragraph, etc. ex. “what is the revenue of apple?” → [revenue, apple]

/answer — use this endpoint to take the keywords and turn them into insights; this endpoint mines a massive data store and finds you the answer to any question asked in Natural Language. ex. “what is the revenue of apple?” → “The totalRevenue of Apple Inc. (AAPL) is 91722000000.0”

Since the response is JSON, we can work with it however we need. Let’s just grab the natural language response for this query!
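To make the flow concrete, here’s a minimal sketch of piecing the three blocks together with the requests library. The base URL, header names, and request/response field names below are placeholders for illustration, not the documented API, so check the Spawner docs for the real values.

import json
import requests

# Placeholder base URL and auth header -- swap in the real Spawner values and your API key.
BASE_URL = "https://api.spawner.example"
HEADERS = {"Content-Type": "application/json", "x-api-key": "YOUR_API_KEY"}

def call(endpoint, payload):
    # POST a JSON payload to a Spawner endpoint and return the parsed JSON response.
    resp = requests.post(BASE_URL + endpoint, data=json.dumps(payload), headers=HEADERS)
    resp.raise_for_status()
    return resp.json()

# Piece the blocks together: clean -> understand -> answer.
raw_text = "What is#!!@ tHe revENue of aPPle?"
cleaned = call("/clean", {"text": raw_text})               # e.g. {"text": "what is the revenue of apple"}
keywords = call("/understand", {"text": cleaned["text"]})  # e.g. {"keywords": ["revenue", "apple"]}
answer = call("/answer", {"text": cleaned["text"]})

# Grab just the natural language response from the JSON.
print(answer.get("answer"))  # e.g. "The totalRevenue of Apple Inc. (AAPL) is 91722000000.0"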

Here are some templates in Python, R, and more!

We built a bunch of open-source templates so you can get started with the Spawner API right away!

We started with some use cases we are most familiar with. As a team of daily practitioners of NLP, as well as quants/stock traders, we’re first delivering endpoints for NLP and finance.

As you’re likely well aware by now, you can use our API in any programming language you want. Here are some templates to get you started.

Spawner + Python/Streamlit

At Spawner, we love Streamlit. The value it brings to Data Science and the Python community at large is massive.

Here’s an example of our existing template you can try out.

In this example, ask (almost) any financial question and get back an answer! In this case, we’re asking for the P/E Ratio of Apple. We follow it up with a question about the revenue of Apple.

We can easily use the requests library to call an endpoint.

# POST the request payload (url, data, and headers set up beforehand)
x = requests.post(url, data=json.dumps(data), headers=headers)
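Wrapped in a tiny Streamlit app, that call might look something like the sketch below. The endpoint URL, headers, and response field name are assumptions for illustration rather than the template’s actual code.

import json
import requests
import streamlit as st

# Placeholder URL and headers -- substitute the real Spawner /answer endpoint and your API key.
url = "https://api.spawner.example/answer"
headers = {"Content-Type": "application/json", "x-api-key": "YOUR_API_KEY"}

st.title("Ask a financial question")
question = st.text_input("Question", "What is the P/E ratio of Apple?")

if st.button("Ask"):
    data = {"text": question}
    x = requests.post(url, data=json.dumps(data), headers=headers)
    x.raise_for_status()
    # Show the natural language answer (field name assumed here).
    st.write(x.json().get("answer", x.json()))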
Testing NLP and Finance Endpoints…

In this next example we showcase our backtester. It uses the /backtest endpoint, which you can integrate into any of your trading applications. Use /backtest to quickly input any tickers or stock names and get the cumulative performance of that portfolio, along with the performance of each stock over that timeframe. It currently supports total return, CAGR, Sharpe ratio, Calmar ratio, and max drawdown.
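As a rough sketch (again, the URL, headers, and field names are assumptions rather than the documented schema), a /backtest call could look like this:

import json
import requests

# Placeholder URL and headers -- see the Spawner docs for the real /backtest endpoint.
url = "https://api.spawner.example/backtest"
headers = {"Content-Type": "application/json", "x-api-key": "YOUR_API_KEY"}

# Hypothetical payload: the tickers (or stock names) to backtest as a portfolio.
data = {"tickers": ["AAPL", "MSFT", "TSLA"]}

resp = requests.post(url, data=json.dumps(data), headers=headers)
resp.raise_for_status()

# The response should include portfolio-level metrics (total return, CAGR, Sharpe ratio,
# Calmar ratio, max drawdown) plus per-stock performance over the timeframe.
print(resp.json())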

Python + Jupyter Notebooks

We threw some of our endpoints into a Jupyter Notebook.

You can check that out here.

Python (Flask) + HTML/CSS/JS

HTML/CSS/JS is still the bread and butter of web applications, so we threw something together that shows off our endpoints in a clean, easy-to-use web application.
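For a sense of the shape of that app, here’s a minimal Flask sketch that forwards a question from an HTML form to the API; the endpoint URL, headers, and the index.html template are assumptions for illustration, not the actual template code.

import json
import requests
from flask import Flask, render_template, request

app = Flask(__name__)

# Placeholder endpoint and headers -- substitute the real Spawner URL and your API key.
ANSWER_URL = "https://api.spawner.example/answer"
HEADERS = {"Content-Type": "application/json", "x-api-key": "YOUR_API_KEY"}

@app.route("/", methods=["GET", "POST"])
def index():
    answer = None
    if request.method == "POST":
        question = request.form.get("question", "")
        resp = requests.post(ANSWER_URL, data=json.dumps({"text": question}), headers=HEADERS)
        answer = resp.json()
    # index.html would hold the form and display the answer.
    return render_template("index.html", answer=answer)

if __name__ == "__main__":
    app.run(debug=True)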

In Closing

We’re super stoked about what we’ve built. We’re even more excited about seeing what our users can come up with. We see a future with dozens of endpoints across many different domains. The public interface for AI awakens.

We hope you’ll join us on this journey!