CHATGPT IN BUSINESS USE – CREATING FREEDOM THROUGH LARGE LANGUAGE MODELS
In-person seminar in Wiesbaden or online seminar, 1 day: €1,090 per person (net)
How can large language models be used in everyday business to automate routine tasks? In this seminar, we'll demonstrate this using ChatGPT from OpenAI, certainly one of the best LLMs currently on the market.
Dates for Open Training Crash Course: 29.02.2024, 11.10.2024

LEARNING OBJECTIVES AND AGENDA
Goals:
- LLM basics and first steps in the OpenAI portal
- OpenAI API
- Prompt Engineering
- Using your own information in the LLM: Option 1: In-context data
- Using your own information in the LLM: Option 2: Retrieval-Augmented Generation (RAG)
- Using your own information in the LLM: Option 3: Fine-tuning
- Using other LLMs
IN-HOUSE SEMINAR
Seminars can be held at the customer’s location or online
€1,390.00
per day for up to 4 participants, plus statutory VAT
All content of the in-house seminars is individually tailored and taught to specific target groups.
Intensive follow-up support enables participants to implement their knowledge in the shortest possible time.
Recommended seminar duration: 1-1.5 days
Rental fee for a training notebook (on request): €60 per day, per training computer
WORKSHOP
You tell us your topics!
Price on request
plus statutory VAT and travel expenses if applicable
All workshop content is individually tailored and taught to specific target groups.
We are happy to conduct the workshop at your location, in Wiesbaden or online.
Rental fee for a training notebook (on request): €60 per day, per training computer
Module 1: LLM Basics and First Steps in the OpenAI Portal
In this module, you'll get to know the interface of OpenAI's ChatGPT. Using practical examples, we'll explain the basics of Large Language Models (LLMs). You can then test LLMs directly in the OpenAI portal.
Module 2: OpenAI API
ChatGPT is much more conveniently controlled via the Python API. OpenAI provides a Python package with many useful functions. In this section, we'll write Python code and control ChatGPT through the API. Related tasks, such as counting tokens and calculating costs, will also be covered.
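To give a first impression, here is a minimal sketch of such an API call with the official openai Python package; the model name, the use of the tiktoken package for token counting and the per-token price are illustrative assumptions, not binding figures from the course.

```python
from openai import OpenAI
import tiktoken

client = OpenAI()  # reads the API key from the OPENAI_API_KEY environment variable

prompt = "Summarize the advantages of automating routine e-mail processing in two sentences."

# Send the prompt to the chat completions endpoint
response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # example model name
    messages=[{"role": "user", "content": prompt}],
)
print(response.choices[0].message.content)

# Count tokens locally and roughly estimate the request cost
encoding = tiktoken.encoding_for_model("gpt-3.5-turbo")
n_prompt_tokens = len(encoding.encode(prompt))
price_per_1k_tokens = 0.0005  # assumed example price in USD; check the current price list
print(f"{n_prompt_tokens} prompt tokens, approx. {n_prompt_tokens / 1000 * price_per_1k_tokens:.6f} USD")
```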
Module 3: Prompt Engineering
Prompt engineering refers to the skillful formulation of the questions sent to the large language model in order to obtain optimal, tailored results. For example, the LLM can first be told that it is a clerk responsible for maintaining customer master data. Requests for changes to master data then arrive by email. The AI is tasked with determining what the customer wants (e.g., a change of IBAN) and then extracting the new IBAN.
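A small sketch of this role-prompt pattern could look as follows; the email text, the wording of the system prompt and the model name are made-up examples.

```python
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment

# Hypothetical incoming customer email
email = (
    "Dear Sir or Madam,\n"
    "please update my bank details. My new IBAN is DE89 3704 0044 0532 0130 00.\n"
    "Kind regards, Jane Doe"
)

messages = [
    # Role prompt: the model acts as a clerk for customer master data
    {"role": "system",
     "content": "You are a clerk responsible for maintaining customer master data. "
                "Determine what the customer wants and extract the relevant data."},
    # The actual request, including the email text
    {"role": "user",
     "content": f"What change does the customer request, and what is the new IBAN?\n\n{email}"},
]

response = client.chat.completions.create(model="gpt-3.5-turbo", messages=messages)
print(response.choices[0].message.content)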
Module 4: Using your own information in the LLM: Option 1: In-context data
Things get interesting when you can also use your own data. One example has already been described: a customer email is submitted together with the instruction to extract the data relevant to the customer's request.
However, the LLM can also process more extensive information. The deciding factor is the volume of the data: if it is small (on the order of a few A4 pages), it can be passed directly in the prompt and processed there.
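The following sketch shows this in-context approach: a short internal document is read from a file and placed directly into the prompt. The file name, the question and the model name are hypothetical examples.

```python
from openai import OpenAI

client = OpenAI()

# Load a short internal document (at most a few A4 pages of plain text,
# so that it still fits into the model's context window)
with open("product_terms.txt", encoding="utf-8") as f:  # hypothetical file
    document = f.read()

messages = [
    {"role": "system", "content": "Answer questions using only the document provided."},
    {"role": "user",
     "content": f"Document:\n{document}\n\nQuestion: What notice period applies to the basic plan?"},
]

response = client.chat.completions.create(model="gpt-3.5-turbo", messages=messages)
print(response.choices[0].message.content)
```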
Module 5: Using your own information in the LLM: Option 2: Retrieval-Augmented Generation, RAG
If more information is required, a two-stage process can be used. In the first step, the texts are processed and stored in a vector database as so-called chunks (text sections of variable length).
A query then searches the vector database using classic information retrieval methods and retrieves the relevant chunks. These chunks are passed to the Large Language Model together with the query, which then processes them. In this way, even up-to-date information can be handled.
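As a rough sketch of this two-stage process, the example below uses an in-memory list in place of a real vector database and OpenAI embeddings with cosine similarity for the retrieval step; the chunk texts, the query and the model names are assumptions for illustration.

```python
import numpy as np
from openai import OpenAI

client = OpenAI()
EMBED_MODEL = "text-embedding-3-small"  # assumed embedding model name

def embed(texts):
    """Return one embedding vector per input text."""
    result = client.embeddings.create(model=EMBED_MODEL, input=texts)
    return [np.array(item.embedding) for item in result.data]

# Step 1: store the chunks (a plain Python list stands in for the vector database)
chunks = [
    "Our support hotline is available Monday to Friday from 8 a.m. to 6 p.m.",
    "Invoices are issued at the end of each month and are due within 14 days.",
    "The travel expense policy requires original receipts for amounts over 50 euros.",
]
chunk_vectors = embed(chunks)

# Step 2: embed the query and retrieve the most similar chunks
query = "When do I have to pay an invoice?"
query_vector = embed([query])[0]

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

scores = [cosine(query_vector, v) for v in chunk_vectors]
top_indices = np.argsort(scores)[::-1][:2]          # indices of the two best chunks
context = "\n".join(chunks[i] for i in top_indices)

# Step 3: pass the retrieved chunks together with the query to the LLM
response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user",
               "content": f"Answer using only the following context:\n{context}\n\nQuestion: {query}"}],
)
print(response.choices[0].message.content)
```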
Module 6: Using your own information in the LLM: Option 3: Fine-tuning
To process larger data sets directly in the LLM, fine-tuning is possible. This involves further training the LLM with the goal of adapting the weights of some hidden layers to the new data.
Note that the fine-tuning process is time-consuming, and there are additional costs: OpenAI offers fine-tuning options that are subject to a fee. The training data must first be converted into the required format, which can be done in Python.
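A minimal sketch of this data preparation and job submission is shown below; the training examples, the system prompt and the base model name are hypothetical, and a real fine-tuning run needs considerably more examples.

```python
import json
from openai import OpenAI

client = OpenAI()

# Hypothetical training examples: customer request -> extracted master data change
examples = [
    {"request": "Please change my IBAN to DE89 3704 0044 0532 0130 00.",
     "result": '{"change": "IBAN", "value": "DE89370400440532013000"}'},
    {"request": "My new address is Hauptstr. 1, 65185 Wiesbaden.",
     "result": '{"change": "address", "value": "Hauptstr. 1, 65185 Wiesbaden"}'},
]

# Convert the data into the chat-style JSONL format expected for fine-tuning
with open("training_data.jsonl", "w", encoding="utf-8") as f:
    for ex in examples:
        record = {"messages": [
            {"role": "system", "content": "Extract the requested master data change as JSON."},
            {"role": "user", "content": ex["request"]},
            {"role": "assistant", "content": ex["result"]},
        ]}
        f.write(json.dumps(record, ensure_ascii=False) + "\n")

# Upload the file and start a (paid) fine-tuning job
training_file = client.files.create(file=open("training_data.jsonl", "rb"), purpose="fine-tune")
job = client.fine_tuning.jobs.create(training_file=training_file.id, model="gpt-3.5-turbo")
print(job.id)
```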
Module 7: Using other LLMs
OpenAI isn't the only company to have developed a large language model. Numerous other LLMs are available, and the Hugging Face platform offers a good overview. Llama from Meta and Falcon are certainly worth mentioning. Hugging Face also offers adapted variants of both models, for example for German or French. The module provides an overview of how to select a suitable model (license, supported languages, etc.) and also covers the requirements for deploying or fine-tuning such models.
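As a taste of working with such openly available models, the sketch below loads an instruction-tuned model from the Hugging Face Hub via the transformers library; the model ID and the prompt are examples, and running a 7B model locally requires substantial GPU memory plus the accelerate package for automatic device placement.

```python
from transformers import pipeline

# Load an openly available instruction-tuned model from the Hugging Face Hub
generator = pipeline(
    "text-generation",
    model="tiiuae/falcon-7b-instruct",  # example model ID
    device_map="auto",                  # distribute the weights across available GPUs/CPU
)

prompt = "Summarize in one sentence why routine e-mail processing can be automated with an LLM."
output = generator(prompt, max_new_tokens=80, do_sample=False)
print(output[0]["generated_text"])
```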
CONTENTS
Automation creates freedom by having AI take over routine tasks. However, many routine tasks that can be automated are based on relatively unstructured data. This particularly applies to text, such as customer inquiries requesting an address or IBAN change, or travel expense reports, which require information from receipts, invoices, etc.
This is where the relatively new large language models come into play: these, too, are deep learning models. Until recently, the hardware simply wasn't available to train models complex enough to handle human language at a level that allows an AI to work with it: to independently extract information, summarize text, describe the meaning of a text, and so on.
But how can large language models be used in everyday business to automate routine tasks? In this seminar, we'll demonstrate this using the example of ChatGPT from OpenAI, certainly one of the best LLMs currently on the market. Query automation is achieved using the Python API, which allows ChatGPT to be embedded directly into your own software. Information can then easily be transferred from company databases to the LLM, and the results processed further within your own systems.
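The following sketch illustrates this kind of integration: a customer inquiry is read from a company database, handed to the LLM, and the draft reply is written back for further processing. The SQLite database, the table schema and the model name are purely hypothetical examples.

```python
import sqlite3
from openai import OpenAI

client = OpenAI()

# Read an open customer inquiry from a company database (hypothetical schema)
conn = sqlite3.connect("crm.db")
row = conn.execute("SELECT id, text FROM inquiries WHERE status = 'open' LIMIT 1").fetchone()
inquiry_id, inquiry_text = row

# Let the LLM classify the inquiry and draft a reply
response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user",
               "content": f"Classify this customer inquiry and draft a short reply:\n{inquiry_text}"}],
)
draft = response.choices[0].message.content

# Write the result back so it can be processed further in the company's own systems
conn.execute("UPDATE inquiries SET draft_reply = ?, status = 'drafted' WHERE id = ?", (draft, inquiry_id))
conn.commit()
```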



