Category:ML Server

From Interaction Station Wiki
{{info|1=We are currently in the process of writing this category. Articles are unfinished and may change!}}  
= Introduction =
At the Interaction Station you can run various machine learning models on the station's server. Our server hosts a service named [https://localai.io/ LocalAI], a drop-in replacement that is compatible with OpenAI's API specification. It allows you to use large language models (LLMs), transcribe audio, generate images and generate audio. The only thing you need to know is how to tell the server which model to run so that it can give you a response. This tutorial will walk you through the process using the [[:Category:Python|Python]] programming language. To follow along, it is advised that you have some understanding of Python.
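As a minimal sketch of what such a request looks like, the snippet below builds a chat-completion request against the OpenAI-compatible <code>/v1/chat/completions</code> endpoint using only Python's standard library. The server address and model name here are placeholders, not the station's real values; ask at the station for the actual address and the list of installed models.

```python
# Sketch: querying a LocalAI server through its OpenAI-compatible API.
# BASE_URL and the model name are placeholders, not the station's real values.
import json
import urllib.request

BASE_URL = "http://localhost:8080/v1"  # hypothetical LocalAI address

# The request body follows OpenAI's chat-completions format:
# a model name plus a list of role/content messages.
payload = {
    "model": "gpt-4",  # LocalAI maps this name to a locally installed model
    "messages": [
        {"role": "user", "content": "Hello, what can you do?"},
    ],
}

def build_request(base_url: str, body: dict) -> urllib.request.Request:
    """Build the POST request for the /chat/completions endpoint."""
    return urllib.request.Request(
        url=f"{base_url}/chat/completions",
        data=json.dumps(body).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_request(BASE_URL, payload)
print(req.full_url)  # http://localhost:8080/v1/chat/completions
# urllib.request.urlopen(req) would send it; the JSON response carries
# the model's reply under choices[0]["message"]["content"].
```

In practice you would more likely use the official <code>openai</code> Python package and point its <code>base_url</code> at the LocalAI server, which is exactly what the drop-in compatibility makes possible.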
As mentioned before, LocalAI is compatible with OpenAI's API specification. That means you can also read [https://platform.openai.com/docs/api-reference/introduction OpenAI's API Reference] for more information. The pages in this category borrow heavily from that documentation.
  
Interaction Station Machine Learning Server
 
 
[[Category:AI & Machine Learning]]

Latest revision as of 15:43, 23 September 2024


Pages in category "ML Server"

The following 4 pages are in this category, out of 4 total.