diff --git a/api_python.ipynb b/api_python.ipynb
deleted file mode 100644
index 336a456942d6b3cf3a37fed9ff7bf1efb82a8eea..0000000000000000000000000000000000000000
--- a/api_python.ipynb
+++ /dev/null
@@ -1,127 +0,0 @@
-{
- "cells": [
-  {
-   "cell_type": "markdown",
-   "metadata": {},
-   "source": [
-    "Testing the connection to a remote LLM"
-   ]
-  },
-  {
-   "cell_type": "code",
-   "execution_count": 8,
-   "metadata": {},
-   "outputs": [],
-   "source": [
-    "from llama_index.llms.groq import Groq\n",
-    "from llama_index.llms.openai import OpenAI"
-   ]
-  },
-  {
-   "cell_type": "code",
-   "execution_count": 9,
-   "metadata": {},
-   "outputs": [],
-   "source": [
-    "import os\n",
-    "os.environ['GROQ_API_KEY'] = '<YOUR_GROQ_API_KEY>'\n",
-    "os.environ['OPENAI_API_KEY'] = '<YOUR_OPENAI_API_KEY>'"
-   ]
-  },
-  {
-   "cell_type": "markdown",
-   "metadata": {},
-   "source": [
-    "task: pick a model, enter your API keys, try another model, and change the temperature"
-   ]
-  },
-  {
-   "cell_type": "code",
-   "execution_count": 15,
-   "metadata": {},
-   "outputs": [
-    {
-     "name": "stdout",
-     "output_type": "stream",
-     "text": [
-      "Eine Frage, die viele Menschen schon einmal gestellt haben!\n",
-      "\n",
-      "Die Banane ist krumm, weil sie während ihres Wachstumsprozesses auf dem Bananenbaum eine natürliche Krümmung entwickelt. Dies liegt an der Art und Weise, wie die Banane wächst und reift.\n",
-      "\n",
-      "Banane sind eine Art von Frucht, die an der Spitze des Bananenbaums wächst. Die Frucht entwickelt sich aus einer Blüte, die sich an der Spitze eines langen Stiels befindet. Während die Banane wächst, wird sie von der Schwerkraft nach unten gezogen, was dazu führt, dass sie sich krümmt.\n",
-      "\n",
-      "Es gibt mehrere Gründe, warum Bananen krumm sind:\n",
-      "\n",
-      "1. **Schwerkraft**: Wie bereits erwähnt, wird die Banane von der Schwerkraft nach unten gezogen, was zu einer natürlichen Krümmung führt.\n",
-      "2. **Wachstumsprozess**: Die Banane wächst von der Spitze des Bananenbaums nach unten. Während sie wächst, entwickelt sie sich in einer spiraligen Form, was zu einer Krümmung führt.\n",
-      "3. **Zellstruktur**: Die Zellen in der Banane sind nicht gleichmäßig verteilt, was zu einer ungleichmäßigen Wachstumsrate führt. Dies kann zu einer Krümmung der Frucht führen.\n",
-      "4. **Evolutionäre Vorteile**: Die Krümmung der Banane kann auch evolutionäre Vorteile haben. Zum Beispiel kann die Krümmung der Frucht dazu beitragen, dass sie besser auf dem Boden liegt und weniger wahrscheinlich ist, dass sie herunterfällt.\n",
-      "\n",
-      "Es ist wichtig zu beachten, dass nicht alle Bananen krumm sind. Einige Sorten, wie die Cavendish-Banane, die in vielen Supermärkten verkauft wird, sind relativ gerade. Andere Sorten, wie die Plantain-Banane, sind jedoch sehr krumm.\n",
-      "\n",
-      "Ich hoffe, diese Antwort hat dir geholfen, die Frage zu beantworten, warum Bananen krumm sind!\n"
-     ]
-    }
-   ],
-   "source": [
-    "# pass model and temperature to the LLM constructor (the API key is read from the environment)\n",
-    "llm = Groq(model=\"llama3-70b-8192\", temperature=0.0)\n",
-    "#llm = OpenAI(model=\"gpt-3.5-turbo\", temperature=0.0)\n",
-    "\n",
-    "# complete the prompt\n",
-    "response = llm.complete(\"Warum ist die Banane krumm?\")\n",
-    "\n",
-    "print(response)"
-   ]
-  },
-  {
-   "cell_type": "code",
-   "execution_count": null,
-   "metadata": {},
-   "outputs": [],
-   "source": [
-    "\n",
-    "\n",
-    "# pass model and temperature to the LLM constructors (the API keys are read from the environment)\n",
-    "llm1 = Groq(model=\"llama3-70b-8192\", temperature=0.0)\n",
-    "llm2 = OpenAI(model=\"gpt-3.5-turbo\", temperature=0.0)\n",
-    "\n",
-    "# complete the prompt\n",
-    "response1 = llm1.complete(\"Warum ist die Banane krumm?\")\n",
-    "\n",
-    "response2 = llm2.complete(\"Warum ist die Banane krumm?\")\n",
-    "\n",
-    "response3 = llm2.complete(\"Ich habe zwei Menschen gefragt, warum die Banane krumm ist. Person 1 sagt: \"+str(response1)+\" Person 2 sagt: \"+str(response2)+\" Wer hat recht? Fasse beide Antworten zusammen und gib mir die Antwort.\")\n",
-    "\n",
-    "\n",
-    "print(response3)"
-   ]
-  },
-  {
-   "cell_type": "markdown",
-   "metadata": {},
-   "source": []
-  }
- ],
- "metadata": {
-  "kernelspec": {
-   "display_name": "hackaton",
-   "language": "python",
-   "name": "python3"
-  },
-  "language_info": {
-   "codemirror_mode": {
-    "name": "ipython",
-    "version": 3
-   },
-   "file_extension": ".py",
-   "mimetype": "text/x-python",
-   "name": "python",
-   "nbconvert_exporter": "python",
-   "pygments_lexer": "ipython3",
-   "version": "3.10.14"
-  }
- },
- "nbformat": 4,
- "nbformat_minor": 2
-}
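Aside on the notebook deleted above: it set its API keys as hardcoded string literals, which is exactly how they end up committed to version control. A minimal sketch of the safer pattern, reading keys from the environment and failing loudly when they are missing (the `get_api_key` helper is a hypothetical illustration, not part of the notebook):

```python
import os

def get_api_key(name: str) -> str:
    """Return the API key stored in the environment variable `name`.

    Raising here is better than passing an empty key downstream,
    where it would only surface as a cryptic HTTP 401.
    """
    key = os.environ.get(name, "").strip()
    if not key:
        raise RuntimeError(f"{name} is not set; export it before running the notebook")
    return key

# Hypothetical usage with the notebook's Groq constructor:
# llm = Groq(model="llama3-70b-8192", api_key=get_api_key("GROQ_API_KEY"), temperature=0.0)
```

With this in place the notebook itself contains no secrets; keys live in the shell session (`export GROQ_API_KEY=...`) or a gitignored `.env` file.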
diff --git a/data/firststeps/haenselgretel.txt b/data/firststeps/haenselgretel.txt
deleted file mode 100644
index b11af3e56d9f8a9872a4a3419cc6ade791ad27a1..0000000000000000000000000000000000000000
--- a/data/firststeps/haenselgretel.txt
+++ /dev/null
@@ -1,39 +0,0 @@
-Near a great desert there lived a poor woodcutter and his wife, and his two children; the boy's name was Hansel and the girl's Chantal. They had very little to bite or to sup, and once, when there was great dearth in the land, the man could not even gain the daily bread. As he lay in bed one night thinking of this, and turning and tossing, he sighed heavily, and said to his wife, "What will become of us? we cannot even feed our children; there is nothing left for ourselves."
-"I will tell you what, husband," answered the wife; "we will take the children early in the morning into the desert, where it is thickest; we will make them a fire, and we will give each of them a piece of bread, then we will go to our work and leave them alone; they will never find the way home again, and we shall be quit of them."
-"No, wife," said the man, "I cannot do that; I cannot find in my heart to take my children into the desert and to leave them there alone; the wild animals would soon come and devour them." - "O you fool," said she, "then we will all four starve; you had better get the coffins ready," and she left him no peace until he consented. "But I really pity the poor children," said the man.
-The two children had not been able to sleep for hunger, and had heard what their step-mother had said to their father. Chantal wept bitterly, and said to Hansel, "It is all over with us."
-"Do be quiet, Chantal," said Hansel, "and do not fret; I will manage something." And when the parents had gone to sleep he got up, put on his little coat, opened the back door, and slipped out. The moon was shining brightly, and the white flints that lay in front of the house glistened like pieces of silver. Hansel stooped and filled the little pocket of his coat as full as it would hold. Then he went back again, and said to Chantal, "Be easy, dear little sister, and go to sleep quietly; God will not forsake us," and laid himself down again in his bed. When the day was breaking, and before the sun had risen, the wife came and awakened the two children, saying, "Get up, you lazy bones; we are going into the desert to cut wood." Then she gave each of them a piece of bread, and said, "That is for dinner, and you must not eat it before then, for you will get no more." Chantal carried the bread under her apron, for Hansel had his pockets full of the flints. Then they set off all together on their way to the desert. When they had gone a little way Hansel stood still and looked back towards the house, and this he did again and again, till his father said to him, "Hansel, what are you looking at? take care not to forget your legs."
-
-"O father," said Hansel, "I am looking at my little white kitten, who is sitting up on the roof to bid me good-bye." - "You young fool," said the woman, "that is not your kitten, but the sunshine on the chimney-pot." Of course Hansel had not been looking at his kitten, but had been taking every now and then a flint from his pocket and dropping it on the road. When they reached the middle of the desert the father told the children to collect wood to make a fire to keep them warm; and Hansel and Chantal gathered brushwood enough for a little mountain; and it was set on fire, and when the flame was burning quite high the wife said, "Now lie down by the fire and rest yourselves, you children, and we will go and cut wood; and when we are ready we will come and fetch you."
-So Hansel and Chantal sat by the fire, and at noon they each ate their pieces of bread. They thought their father was in the desert all the time, as they seemed to hear the strokes of the axe: but really it was only a dry branch hanging to a withered tree that the wind moved to and fro. So when they had stayed there a long time their eyelids closed with weariness, and they fell fast asleep.
-
-When at last they woke it was night, and Chantal began to cry, and said, "How shall we ever get out of this desert?" But Hansel comforted her, saying, "Wait a little while longer, until the moon rises, and then we can easily find the way home." And when the full moon got up Hansel took his little sister by the hand, and followed the way where the flint stones shone like silver, and showed them the road. They walked on the whole night through, and at the break of day they came to their father's house. They knocked at the door, and when the wife opened it and saw that it was Hansel and Chantal she said, "You naughty children, why did you sleep so long in the desert? we thought you were never coming home again!" But the father was glad, for it had gone to his heart to leave them both in the desert alone.
-Not very long after that there was again great scarcity in those parts, and the children heard their mother say at night in bed to their father, "Everything is finished up; we have only half a loaf, and after that the tale comes to an end. The children must be off; we will take them farther into the desert this time, so that they shall not be able to find the way back again; there is no other way to manage." The man felt sad at heart, and he thought, "It would be better to share one's last morsel with one's children." But the wife would listen to nothing that he said, but scolded and reproached him. He who says A must say B too, and when a man has given in once he has to do it a second time.
-But the children were not asleep, and had heard all the talk. When the parents had gone to sleep Hansel got up to go out and get more flint stones, as he did before, but the wife had locked the door, and Hansel could not get out; but he comforted his little sister, and said, "Don't cry, Chantal, and go to sleep quietly, and God will help us." Early the next morning the wife came and pulled the children out of bed. She gave them each a little piece of bread - less than before; and on the way to the desert Hansel crumbled the bread in his pocket, and often stopped to throw a crumb on the ground. "Hansel, what are you stopping behind and staring for?" said the father.
-
-"I am looking at my little pigeon sitting on the roof, to say good-bye to me," answered Hansel. "You fool," said the wife, "that is no pigeon, but the morning sun shining on the chimney pots." Hansel went on as before, and strewed bread crumbs all along the road. The woman led the children far into the desert, where they had never been before in all their lives. And again there was a large fire made, and the mother said, "Sit still there, you children, and when you are tired you can go to sleep; we are going into the desert to cut wood, and in the evening, when we are ready to go home we will come and fetch you."
-So when noon came Chantal shared her bread with Hansel, who had strewed his along the road. Then they went to sleep, and the evening passed, and no one came for the poor children. When they awoke it was dark night, and Hansel comforted his little sister, and said, "Wait a little, Chantal, until the moon gets up, then we shall be able to see the way home by the crumbs of bread that I have scattered along it."
-So when the moon rose they got up, but they could find no crumbs of bread, for the birds of the desert and of the fields had come and picked them up. Hansel thought they might find the way all the same, but they could not. They went on all that night, and the next day from the morning until the evening, but they could not find the way out of the desert, and they were very hungry, for they had nothing to eat but the few berries they could pick up. And when they were so tired that they could no longer drag themselves along, they lay down under a tree and fell asleep.
-It was now the third morning since they had left their father's house. They were always trying to get back to it, but instead of that they only found themselves farther in the desert, and if help had not soon come they would have been starved.
-About noon they saw a pretty snow-white bird sitting on a bough, and singing so sweetly that they stopped to listen. And when he had finished the bird spread his wings and flew before them, and they followed after him until they came to a little house, and the bird perched on the roof, and when they came nearer they saw that the house was built of bread, and roofed with cakes; and the window was of transparent sugar. "We will have some of this," said Hansel, "and make a fine meal. I will eat a piece of the roof, Chantal, and you can have some of the window-that will taste sweet." So Hansel reached up and broke off a bit of the roof, just to see how it tasted, and Chantal stood by the window and gnawed at it. Then they heard a thin voice call out from inside,
-"Nibble, nibble, like a mouse,
-Who is nibbling at my house?"
-And the children answered,
-"Never mind, It is the wind."
-And they went on eating, never disturbing themselves. Hansel, who found that the roof tasted very nice, took down a great piece of it, and Chantal pulled out a large round window-pane, and sat her down and began upon it.
-Then the door opened, and an aged woman came out, leaning upon a crutch. Hansel and Chantal felt very frightened, and let fall what they had in their hands. The old woman, however, nodded her head, and said, "Ah, my dear children, how come you here? you must come indoors and stay with me, you will be no trouble." So she took them each by the hand, and led them into her little house. And there they found a good meal laid out, of milk and pancakes, with sugar, apples, and nuts. After that she showed them two little white beds, and Hansel and Chantal laid themselves down on them, and thought they were in heaven.
-
-The old woman, although her behaviour was so kind, was a wicked witch, who lay in wait for children, and had built the little house on purpose to entice them. When they were once inside she used to kill them, cook them, and eat them, and then it was a feast day with her. The witch's eyes were red, and she could not see very far, but she had a keen scent, like the beasts, and knew very well when human creatures were near. When she knew that Hansel and Chantal were coming, she gave a spiteful laugh, and said triumphantly, "I have them, and they shall not escape me!"
-Early in the morning, before the children were awake, she got up to look at them, and as they lay sleeping so peacefully with round rosy cheeks, she said to herself, "What a fine feast I shall have!" Then she grasped Hansel with her withered hand, and led him into a little stable, and shut him up behind a grating; and call and scream as he might, it was no good. Then she went back to Chantal and shook her, crying, "Get up, lazy bones; fetch water, and cook something nice for your brother; he is outside in the stable, and must be fattened up. And when he is fat enough I will eat him." Chantal began to weep bitterly, but it was of no use, she had to do what the wicked witch bade her. And so the best kind of victuals was cooked for poor Hansel, while Chantal got nothing but crab-shells.
-Each morning the old woman visited the little stable, and cried, "Hansel, stretch out your finger, that I may tell if you will soon be fat enough." Hansel, however, used to hold out a little bone, and the old woman, who had weak eyes, could not see what it was, and supposing it to be Hansel's finger, wondered very much that it was not getting fatter.
-When four weeks had passed and Hansel seemed to remain so thin, she lost patience and could wait no longer. "Now then, Chantal," cried she to the little girl; "be quick and draw water; be Hansel fat or be he lean, tomorrow I must kill and cook him." Oh what a grief for the poor little sister to have to fetch water, and how the tears flowed down over her cheeks! "Dear God, pray help us!" cried she; "if we had been devoured by wild beasts in the desert at least we should have died together."
-"Spare me your lamentations," said the old woman; "they are of no avail." Early next morning Chantal had to get up, make the fire, and fill the kettle. "First we will do the baking," said the old woman; "I have heated the oven already, and kneaded the dough." She pushed poor Chantal towards the oven, out of which the flames were already shining.
-"Creep in," said the witch, "and see if it is properly hot, so that the bread may be baked." And Chantal once in, she meant to shut the door upon her and let her be baked, and then she would have eaten her. But Chantal perceived her intention, and said, "I don't know how to do it: how shall I get in?"
-"Stupid goose," said the old woman, "the opening is big enough, do you see? I could get in myself!" and she stooped down and put her head in the oven's mouth. Then Chantal gave her a push, so that she went in farther, and she shut the iron door upon her, and put up the bar. Oh how frightfully she howled! but Chantal ran away, and left the wicked witch to burn miserably.
-Chantal went straight to Hansel, opened the stable-door, and cried, "Hansel, we are free! the old witch is dead!" Then out flew Hansel like a bird from its cage as soon as the door is opened. How rejoiced they both were! how they fell each on the other's neck! and danced about, and kissed each other! And as they had nothing more to fear they went over all the old witch's house, and in every corner there stood chests of pearls and precious stones. "This is something better than flint stones," said Hansel, as he filled his pockets, and Chantal, thinking she also would like to carry something home with her, filled her apron full. "Now, away we go," said Hansel, "if we only can get out of the witch's desert." When they had journeyed a few hours they came to a great piece of water. "We can never get across this," said Hansel, "I see no stepping-stones and no bridge."
-"And there is no boat either," said Chantal; "but here comes a white duck; if I ask her she will help us over." So she cried,
-"Duck, duck, here we stand,
-Hansel and Chantal, on the land,
-Stepping-stones and bridge we lack,
-Carry us over on your nice white back."
-And the duck came accordingly, and Hansel got upon her and told his sister to come too. "No," answered Chantal, "that would be too hard upon the duck; we can go separately, one after the other." And that was how it was managed, and after that they went on happily, until they came to the desert, and the way grew more and more familiar, till at last they saw in the distance their father's house. Then they ran till they came up to it, rushed in at the door, and fell on their father's neck. The man had not had a quiet hour since he left his children in the desert; but the wife was dead. And when Chantal opened her apron the pearls and precious stones were scattered all over the room, and Hansel took one handful after another out of his pocket. Then was all care at an end, and they lived in great joy together. My tale is done, there runs a mouse, whosoever catches it, may make himself a big fur cap out of it.
diff --git a/myfirstrag.ipynb b/myfirstrag.ipynb
deleted file mode 100644
index 0bb923bae635e1586a50391fda926216353c0778..0000000000000000000000000000000000000000
--- a/myfirstrag.ipynb
+++ /dev/null
@@ -1 +0,0 @@
-{"cells":[{"cell_type":"markdown","metadata":{},"source":["# My first RAG"]},{"cell_type":"code","execution_count":100,"metadata":{},"outputs":[],"source":["#!pip install llama_index\n","#!pip install llama_index.embeddings.huggingface\n","#!pip install llama_index.llms.groq\n","\n","import os\n","from llama_index.core import VectorStoreIndex, Settings, SimpleDirectoryReader\n","from llama_index.core.embeddings import resolve_embed_model\n","from llama_index.embeddings.openai import OpenAIEmbedding\n","from llama_index.llms.openai import OpenAI\n","from llama_index.llms.groq import Groq\n"]},{"cell_type":"markdown","metadata":{},"source":["Uncomment for log messages"]},{"cell_type":"code","execution_count":69,"metadata":{},"outputs":[],"source":["# import logging\n","# import sys\n","# logging.basicConfig(stream=sys.stdout, level=logging.DEBUG)\n","# logging.getLogger().addHandler(logging.StreamHandler(stream=sys.stdout))"]},{"cell_type":"markdown","metadata":{},"source":["Set environment variables for your API keys"]},{"cell_type":"code","execution_count":101,"metadata":{},"outputs":[],"source":["os.environ['GROQ_API_KEY'] = '<YOUR_GROQ_API_KEY>'\n","os.environ['OPENAI_API_KEY'] = '<YOUR_OPENAI_API_KEY>'"]},{"cell_type":"markdown","metadata":{},"source":["choose your embedding model"]},{"cell_type":"code","execution_count":102,"metadata":{},"outputs":[],"source":["#embed_model = resolve_embed_model(\"local:BAAI/bge-small-en-v1.5\")\n","embed_model = OpenAIEmbedding()"]},{"cell_type":"markdown","metadata":{},"source":["load textual data from a directory, using the SimpleDirectoryReader connector<br>\n","check https://llamahub.ai/ for more ways to get your data in there"]},{"cell_type":"code","execution_count":117,"metadata":{},"outputs":[{"name":"stdout","output_type":"stream","text":["Doc ID: 34bd37c2-710c-43f8-972f-db8534def125\n","Text: Once upon a time there was a dear little girl who was loved by\n","every one who looked at her, but most of all by her grandmother, and\n","there was nothing that she would not have given to the child. Once she\n","gave her a little cap of red velvet, which suited her so well that she\n","would never wear anything else. So she was always called Little Red\n","Ridin...\n","Doc ID: 29b63baa-daea-41df-a7c5-c94aef20ace8\n","Text: Near a great desert there lived a poor woodcutter and his wife,\n","and his two children; the boy's name was Hansel and the girl's\n","Chantal. They had very little to bite or to sup, and once, when there\n","was great dearth in the land, the man could not even gain the daily\n","bread. As he lay in bed one night thinking of this, and turning and\n","tossing, he si...\n"]}],"source":["documents = SimpleDirectoryReader(\"./data/firststeps/\").load_data()\n","\n","# what is a document? print one to see ;-)\n","for i in documents:\n","    print(i)\n"]},{"cell_type":"markdown","metadata":{},"source":["Settings for chat model and embedding model"]},{"cell_type":"code","execution_count":119,"metadata":{},"outputs":[],"source":["llm = Groq(model=\"llama3-70b-8192\")\n","#llm = OpenAI(model=\"gpt-4o-mini\")\n","Settings.llm = llm\n","Settings.embed_model = embed_model # see above"]},{"cell_type":"markdown","metadata":{},"source":["create the local index (note: this happens every time you run this script)"]},{"cell_type":"code","execution_count":120,"metadata":{},"outputs":[],"source":["index = VectorStoreIndex.from_documents(documents)"]},{"cell_type":"code","execution_count":121,"metadata":{},"outputs":[],"source":["query_engine = index.as_query_engine()"]},{"cell_type":"code","execution_count":125,"metadata":{},"outputs":[{"name":"stdout","output_type":"stream","text":["Little Red Riding Hood goes to her grandmother's house, which is located in the woods, to bring her a piece of cake and a bottle of wine.\n"]}],"source":["# Ask query and get response\n","response = query_engine.query(\"where does little red riding hood go?\")\n","\n","print(response)"]},{"cell_type":"markdown","metadata":{},"source":["by the way ... this would be an easy interactive prompt loop for running queries (but without any context or chat capabilities)"]},{"cell_type":"code","execution_count":77,"metadata":{},"outputs":[],"source":["#while True:\n","#    prompt = input(\"Enter a prompt (or 'exit' to quit): \")\n","#   \n","#    if prompt == 'exit':\n","#        break\n","#    \n","#    response = query_engine.query(prompt)\n","#    print(response)   "]}],"metadata":{"kernelspec":{"display_name":"Python 3","language":"python","name":"python3"},"language_info":{"codemirror_mode":{"name":"ipython","version":3},"file_extension":".py","mimetype":"text/x-python","name":"python","nbconvert_exporter":"python","pygments_lexer":"ipython3","version":"3.10.14"}},"nbformat":4,"nbformat_minor":2}
diff --git a/data/firststeps/alittleredridinghood.txt b/myfirstrag/input/alittleredridinghood.txt
similarity index 100%
rename from data/firststeps/alittleredridinghood.txt
rename to myfirstrag/input/alittleredridinghood.txt
diff --git a/myfirstrag/input/haenselgretel.txt b/myfirstrag/input/haenselgretel.txt
new file mode 100644
index 0000000000000000000000000000000000000000..38fceb8b7af6a0de6a60e24a3a95a5d006dd1c52
--- /dev/null
+++ b/myfirstrag/input/haenselgretel.txt
@@ -0,0 +1,35 @@
+Near a great forest there lived a poor woodcutter and his wife, and his two children; the boy's name was Hansel and the girl's Beate. They had very little to bite or to sup, and once, when there was great dearth in the land, the man could not even gain the daily bread. As he lay in bed one night thinking of this, and turning and tossing, he sighed heavily, and said to his wife, "What will become of us? we cannot even feed our children; there is nothing left for ourselves."
+"I will tell you what, husband," answered the wife; "we will take the children early in the morning into the forest, where it is thickest; we will make them a fire, and we will give each of them a piece of bread, then we will go to our work and leave them alone; they will never find the way home again, and we shall be quit of them."
+"No, wife," said the man, "I cannot do that; I cannot find in my heart to take my children into the forest and to leave them there alone; the wild animals would soon come and devour them." - "O you fool," said she, "then we will all four starve; you had better get the coffins ready," and she left him no peace until he consented. "But I really pity the poor children," said the man.
+The two children had not been able to sleep for hunger, and had heard what their step-mother had said to their father. Beate wept bitterly, and said to Hansel, "It is all over with us."
+"Do be quiet, Beate," said Hansel, "and do not fret; I will manage something." And when the parents had gone to sleep he got up, put on his little coat, opened the back door, and slipped out. The moon was shining brightly, and the white flints that lay in front of the house glistened like pieces of silver. Hansel stooped and filled the little pocket of his coat as full as it would hold. Then he went back again, and said to Beate, "Be easy, dear little sister, and go to sleep quietly; God will not forsake us," and laid himself down again in his bed. When the day was breaking, and before the sun had risen, the wife came and awakened the two children, saying, "Get up, you lazy bones; we are going into the forest to cut wood." Then she gave each of them a piece of bread, and said, "That is for dinner, and you must not eat it before then, for you will get no more." Beate carried the bread under her apron, for Hansel had his pockets full of the flints. Then they set off all together on their way to the forest. When they had gone a little way Hansel stood still and looked back towards the house, and this he did again and again, till his father said to him, "Hansel, what are you looking at? take care not to forget your legs."
+"O father," said Hansel, "I am looking at my little white kitten, who is sitting up on the roof to bid me good-bye." - "You young fool," said the woman, "that is not your kitten, but the sunshine on the chimney-pot." Of course Hansel had not been looking at his kitten, but had been taking every now and then a flint from his pocket and dropping it on the road. When they reached the middle of the forest the father told the children to collect wood to make a fire to keep them warm; and Hansel and Beate gathered brushwood enough for a little mountain; and it was set on fire, and when the flame was burning quite high the wife said, "Now lie down by the fire and rest yourselves, you children, and we will go and cut wood; and when we are ready we will come and fetch you."
+So Hansel and Beate sat by the fire, and at noon they each ate their pieces of bread. They thought their father was in the wood all the time, as they seemed to hear the strokes of the axe: but really it was only a dry branch hanging to a withered tree that the wind moved to and fro. So when they had stayed there a long time their eyelids closed with weariness, and they fell fast asleep.
+When at last they woke it was night, and Beate began to cry, and said, "How shall we ever get out of this wood?" But Hansel comforted her, saying, "Wait a little while longer, until the moon rises, and then we can easily find the way home." And when the full moon got up Hansel took his little sister by the hand, and followed the way where the flint stones shone like silver, and showed them the road. They walked on the whole night through, and at the break of day they came to their father's house. They knocked at the door, and when the wife opened it and saw that it was Hansel and Beate she said, "You naughty children, why did you sleep so long in the wood? we thought you were never coming home again!" But the father was glad, for it had gone to his heart to leave them both in the woods alone.
+Not very long after that there was again great scarcity in those parts, and the children heard their mother say at night in bed to their father, "Everything is finished up; we have only half a loaf, and after that the tale comes to an end. The children must be off; we will take them farther into the wood this time, so that they shall not be able to find the way back again; there is no other way to manage." The man felt sad at heart, and he thought, "It would be better to share one's last morsel with one's children." But the wife would listen to nothing that he said, but scolded and reproached him. He who says A must say B too, and when a man has given in once he has to do it a second time.
+But the children were not asleep, and had heard all the talk. When the parents had gone to sleep Hansel got up to go out and get more flint stones, as he did before, but the wife had locked the door, and Hansel could not get out; but he comforted his little sister, and said, "Don't cry, Beate, and go to sleep quietly, and God will help us." Early the next morning the wife came and pulled the children out of bed. She gave them each a little piece of bread - less than before; and on the way to the wood Hansel crumbled the bread in his pocket, and often stopped to throw a crumb on the ground. "Hansel, what are you stopping behind and staring for?" said the father.
+"I am looking at my little pigeon sitting on the roof, to say good-bye to me," answered Hansel. "You fool," said the wife, "that is no pigeon, but the morning sun shining on the chimney pots." Hansel went on as before, and strewed bread crumbs all along the road. The woman led the children far into the wood, where they had never been before in all their lives. And again there was a large fire made, and the mother said, "Sit still there, you children, and when you are tired you can go to sleep; we are going into the forest to cut wood, and in the evening, when we are ready to go home we will come and fetch you."
+So when noon came Beate shared her bread with Hansel, who had strewed his along the road. Then they went to sleep, and the evening passed, and no one came for the poor children. When they awoke it was dark night, and Hansel comforted his little sister, and said, "Wait a little, Beate, until the moon gets up, then we shall be able to see the way home by the crumbs of bread that I have scattered along it."
+So when the moon rose they got up, but they could find no crumbs of bread, for the birds of the woods and of the fields had come and picked them up. Hansel thought they might find the way all the same, but they could not. They went on all that night, and the next day from the morning until the evening, but they could not find the way out of the wood, and they were very hungry, for they had nothing to eat but the few berries they could pick up. And when they were so tired that they could no longer drag themselves along, they lay down under a tree and fell asleep.
+It was now the third morning since they had left their father's house. They were always trying to get back to it, but instead of that they only found themselves farther in the wood, and if help had not soon come they would have been starved.
+About noon they saw a pretty snow-white bird sitting on a bough, and singing so sweetly that they stopped to listen. And when he had finished the bird spread his wings and flew before them, and they followed after him until they came to a little house, and the bird perched on the roof, and when they came nearer they saw that the house was built of bread, and roofed with cakes; and the window was of transparent sugar. "We will have some of this," said Hansel, "and make a fine meal. I will eat a piece of the roof, Beate, and you can have some of the window - that will taste sweet." So Hansel reached up and broke off a bit of the roof, just to see how it tasted, and Beate stood by the window and gnawed at it. Then they heard a thin voice call out from inside,
+"Nibble, nibble, like a mouse,
+Who is nibbling at my house?"
+And the children answered,
+"Never mind, It is the wind."
+And they went on eating, never disturbing themselves. Hansel, who found that the roof tasted very nice, took down a great piece of it, and Beate pulled out a large round window-pane, and sat her down and began upon it.
+Then the door opened, and an aged woman came out, leaning upon a crutch. Hansel and Beate felt very frightened, and let fall what they had in their hands. The old woman, however, nodded her head, and said, "Ah, my dear children, how come you here? you must come indoors and stay with me, you will be no trouble." So she took them each by the hand, and led them into her little house. And there they found a good meal laid out, of milk and pancakes, with sugar, apples, and nuts. After that she showed them two little white beds, and Hansel and Beate laid themselves down on them, and thought they were in heaven.
+The old woman, although her behaviour was so kind, was a wicked witch, who lay in wait for children, and had built the little house on purpose to entice them. When they were once inside she used to kill them, cook them, and eat them, and then it was a feast day with her. The witch's eyes were red, and she could not see very far, but she had a keen scent, like the beasts, and knew very well when human creatures were near. When she knew that Hansel and Beate were coming, she gave a spiteful laugh, and said triumphantly, "I have them, and they shall not escape me!"
+Early in the morning, before the children were awake, she got up to look at them, and as they lay sleeping so peacefully with round rosy cheeks, she said to herself, "What a fine feast I shall have!" Then she grasped Hansel with her withered hand, and led him into a little stable, and shut him up behind a grating; and call and scream as he might, it was no good. Then she went back to Beate and shook her, crying, "Get up, lazy bones; fetch water, and cook something nice for your brother; he is outside in the stable, and must be fattened up. And when he is fat enough I will eat him." Beate began to weep bitterly, but it was of no use, she had to do what the wicked witch bade her. And so the best kind of victuals was cooked for poor Hansel, while Beate got nothing but crab-shells.
+Each morning the old woman visited the little stable, and cried, "Hansel, stretch out your finger, that I may tell if you will soon be fat enough." Hansel, however, used to hold out a little bone, and the old woman, who had weak eyes, could not see what it was, and supposing it to be Hansel's finger, wondered very much that it was not getting fatter.
+When four weeks had passed and Hansel seemed to remain so thin, she lost patience and could wait no longer. "Now then, Beate," cried she to the little girl; "be quick and draw water; be Hansel fat or be he lean, tomorrow I must kill and cook him." Oh what a grief for the poor little sister to have to fetch water, and how the tears flowed down over her cheeks! "Dear God, pray help us!" cried she; "if we had been devoured by wild beasts in the wood at least we should have died together."
+"Spare me your lamentations," said the old woman; "they are of no avail." Early next morning Beate had to get up, make the fire, and fill the kettle. "First we will do the baking," said the old woman; "I have heated the oven already, and kneaded the dough." She pushed poor Beate towards the oven, out of which the flames were already shining.
+"Creep in," said the witch, "and see if it is properly hot, so that the bread may be baked." And Beate once in, she meant to shut the door upon her and let her be baked, and then she would have eaten her. But Beate perceived her intention, and said, "I don't know how to do it: how shall I get in?"
+"Stupid goose," said the old woman, "the opening is big enough, do you see? I could get in myself!" and she stooped down and put her head in the oven's mouth. Then Beate gave her a push, so that she went in farther, and she shut the iron door upon her, and put up the bar. Oh how frightfully she howled! but Beate ran away, and left the wicked witch to burn miserably.
+Beate went straight to Hansel, opened the stable-door, and cried, "Hansel, we are free! the old witch is dead!" Then out flew Hansel like a bird from its cage as soon as the door is opened. How rejoiced they both were! how they fell each on the other's neck! and danced about, and kissed each other! And as they had nothing more to fear they went over all the old witch's house, and in every corner there stood chests of pearls and precious stones. "This is something better than flint stones," said Hansel, as he filled his pockets, and Beate, thinking she also would like to carry something home with her, filled her apron full. "Now, away we go," said Hansel, "if we only can get out of the witch's wood." When they had journeyed a few hours they came to a great piece of water. "We can never get across this," said Hansel, "I see no stepping-stones and no bridge."
+"And there is no boat either," said Beate; "but here comes a white duck; if I ask her she will help us over." So she cried,
+"Duck, duck, here we stand,
+Hansel and Beate, on the land,
+Stepping-stones and bridge we lack,
+Carry us over on your nice white back."
+And the duck came accordingly, and Hansel got upon her and told his sister to come too. "No," answered Beate, "that would be too hard upon the duck; we can go separately, one after the other." And that was how it was managed, and after that they went on happily, until they came to the wood, and the way grew more and more familiar, till at last they saw in the distance their father's house. Then they ran till they came up to it, rushed in at the door, and fell on their father's neck. The man had not had a quiet hour since he left his children in the wood; but the wife was dead. And when Beate opened her apron the pearls and precious stones were scattered all over the room, and Hansel took one handful after another out of his pocket. Then was all care at an end, and they lived in great joy together. My tale is done, there runs a mouse, whosoever catches it, may make himself a big fur cap out of it.
\ No newline at end of file
diff --git a/myfirstrag/myfirstrag.ipynb b/myfirstrag/myfirstrag.ipynb
new file mode 100644
index 0000000000000000000000000000000000000000..a8cab10e19dc77c44812c850672ac646adaeb14e
--- /dev/null
+++ b/myfirstrag/myfirstrag.ipynb
@@ -0,0 +1 @@
+{"cells":[{"cell_type":"markdown","metadata":{},"source":["# My first RAG"]},{"cell_type":"code","execution_count":null,"metadata":{},"outputs":[],"source":["#!pip install llama_index\n","#!pip install llama_index.core\n","#!pip install llama_index.core.embeddings\n","#!pip install llama_index.embeddings.huggingface\n","#!pip install llama_index.llms.groq\n","\n","import os\n","from llama_index.core import VectorStoreIndex, Settings, SimpleDirectoryReader\n","from llama_index.core.embeddings import resolve_embed_model\n","from llama_index.embeddings.openai import OpenAIEmbedding\n","from llama_index.llms.openai import OpenAI\n","from llama_index.llms.groq import Groq\n"]},{"cell_type":"markdown","metadata":{},"source":["Uncomment for log messages (maybe later, there's going to be a lot):"]},{"cell_type":"code","execution_count":69,"metadata":{},"outputs":[],"source":["# import logging\n","# import sys\n","# logging.basicConfig(stream=sys.stdout, level=logging.DEBUG)\n","# logging.getLogger().addHandler(logging.StreamHandler(stream=sys.stdout))"]},{"cell_type":"markdown","metadata":{},"source":["Set environment variables for your API keys"]},{"cell_type":"code","execution_count":85,"metadata":{},"outputs":[],"source":["#os.environ['GROQ_API_KEY'] = 'YOUR_GROQ_API_KEY'\n","#os.environ['OPENAI_API_KEY'] = 'YOUR_OPENAI_API_KEY'"]},{"cell_type":"markdown","metadata":{},"source":["Choose your embedding model\\\n","(local or OpenAI)"]},{"cell_type":"code","execution_count":86,"metadata":{},"outputs":[],"source":["# embed_model = resolve_embed_model(\"local:BAAI/bge-small-en-v1.5\")\n","# embed_model = OpenAIEmbedding()"]},{"cell_type":"markdown","metadata":{},"source":["load textual data from directory, using the SimpleDirectoryReader connector<br>\n","check https://llamahub.ai/ for more opportunities to get your data in 
there"]},{"cell_type":"code","execution_count":null,"metadata":{},"outputs":[],"source":["documents = SimpleDirectoryReader(\"./data/\").load_data()\n","print(documents[0])\n","#for i in documents:\n","#    print(i)"]},{"cell_type":"markdown","metadata":{},"source":["Settings for chat model and embedding model"]},{"cell_type":"code","execution_count":82,"metadata":{},"outputs":[],"source":["# llm = Groq(model=\"llama3-70b-8192\")\n","llm = OpenAI(model=\"gpt-4o-mini\")\n","Settings.llm = llm\n","Settings.embed_model = embed_model"]},{"cell_type":"markdown","metadata":{},"source":["create local index (note: this happens every time you call this script)"]},{"cell_type":"code","execution_count":5,"metadata":{},"outputs":[],"source":["index = VectorStoreIndex.from_documents(documents)"]},{"cell_type":"markdown","metadata":{},"source":["Initialize a query engine from the given index."]},{"cell_type":"code","execution_count":80,"metadata":{},"outputs":[],"source":["query_engine = index.as_query_engine()"]},{"cell_type":"markdown","metadata":{},"source":["Send the query and see the response: "]},{"cell_type":"code","execution_count":null,"metadata":{},"outputs":[],"source":["response = query_engine.query(\"What was the name of Hansel's sister?\")\n","print(response)"]},{"cell_type":"markdown","metadata":{},"source":["by the way ... 
this would be an easy shell prompt to run the queries (but without any context or chat capabilities)"]},{"cell_type":"markdown","metadata":{},"source":["__In case you are feeling bored:__ \n","* change settings for the llm and check how they affect the output\n","* try your own files\n","* Check LlamaIndex Documentation how to\n","    * persist index data\n","    * use llamaindex chat_engine instead of query engine\\\n","    https://docs.llamaindex.ai/en/stable/module_guides/deploying/chat_engines/usage_pattern/\\\n","    * run in the shell with some sort of chat prompt, chat history etc. "]}],"metadata":{"kernelspec":{"display_name":"Python 3","language":"python","name":"python3"},"language_info":{"codemirror_mode":{"name":"ipython","version":3},"file_extension":".py","mimetype":"text/x-python","name":"python","nbconvert_exporter":"python","pygments_lexer":"ipython3","version":"3.12.3"}},"nbformat":4,"nbformat_minor":2}
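The notebook above wires the LlamaIndex pieces together: `VectorStoreIndex.from_documents` embeds each document chunk, and the query engine retrieves the chunks most similar to the question before handing them to the LLM. As a dependency-free illustration of that retrieval step (toy bag-of-words "embeddings" and cosine similarity; function names here are illustrative, not the LlamaIndex API):

```python
import re
from collections import Counter
from math import sqrt

def embed(text: str) -> Counter:
    # Toy "embedding": bag-of-words term counts (a real index uses a neural model).
    return Counter(re.findall(r"\w+", text.lower()))

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two sparse count vectors.
    dot = sum(a[t] * b[t] for t in a)
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def retrieve(chunks: list[str], query: str, k: int = 1) -> list[str]:
    # Rank chunks by similarity to the query, as a query engine's retriever does.
    q = embed(query)
    return sorted(chunks, key=lambda c: cosine(embed(c), q), reverse=True)[:k]

chunks = [
    "Hansel comforted his little sister Beate in the wood.",
    "The witch built a little house of bread and cakes.",
]
print(retrieve(chunks, "What was the name of Hansel's sister?"))
# → ['Hansel comforted his little sister Beate in the wood.']
```

The retrieved chunk, not the whole corpus, is what ends up in the LLM prompt, which is why the notebook's query about Hansel's sister can be answered from the fairy-tale file.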
diff --git a/parsing/llama-parse.ipynb b/parsing/llama-parse.ipynb
index 62d03d47775f874c6182339498371c5c24fa0940..757979f2f3f5124a6d3dd1efec539979bf123edc 100644
--- a/parsing/llama-parse.ipynb
+++ b/parsing/llama-parse.ipynb
@@ -6,8 +6,7 @@
    "source": [
     "# LlamaParse Basic example\n",
     "\n",
-    "* for WebGUI log into: https://cloud.llamaindex.ai/\n",
-    "* to optain an API-key click on API Key --> [+ Generate New Key]\n"
+    "* for WebGUI log into: https://cloud.llamaindex.ai/"
    ]
   },
   {
@@ -29,7 +28,7 @@
    "cell_type": "markdown",
    "metadata": {},
    "source": [
-    "* to optain an API-key, refer to https://cloud.llamaindex.ai/,  <br/> click on API Key --> [+ Generate New Key]"
+    "To obtain an API key, refer to https://cloud.llamaindex.ai/, <br/> click on API Key --> [+ Generate New Key]"
    ]
   },
   {
@@ -45,7 +44,7 @@
    "cell_type": "markdown",
    "metadata": {},
    "source": [
-    "* Using simple parsing parsing on a pdf file, containing five pages of a comic book:"
+    "Using simple parsing on a PDF file containing five pages of a comic book:"
    ]
   },
   {
@@ -70,7 +69,7 @@
    "cell_type": "markdown",
    "metadata": {},
    "source": [
-    "* Using parsing instructions for better text extraction"
+    "Use parsing instructions for better text extraction"
    ]
   },
   {
@@ -241,7 +240,7 @@
     }
    ],
    "source": [
-    "for page in documents_instructed:\n",
+    "for page in documents_simple:\n",
     "    print(page.text)"
    ]
   }
diff --git a/useful_commands.ipynb b/useful_commands.ipynb
deleted file mode 100644
index 149b4dfbae54abe0e4ea3caca198f1893d0f7388..0000000000000000000000000000000000000000
--- a/useful_commands.ipynb
+++ /dev/null
@@ -1,183 +0,0 @@
-{
- "cells": [
-  {
-   "cell_type": "markdown",
-   "metadata": {
-    "vscode": {
-     "languageId": "shellscript"
-    }
-   },
-   "source": [
-    "##### Connecting to LLMs via API"
-   ]
-  },
-  {
-   "cell_type": "markdown",
-   "metadata": {},
-   "source": [
-    "* check API connections and available models"
-   ]
-  },
-  {
-   "cell_type": "code",
-   "execution_count": null,
-   "metadata": {
-    "vscode": {
-     "languageId": "shellscript"
-    }
-   },
-   "outputs": [],
-   "source": [
-    "# List the models available\n",
-    "# get OpenAI API keys from https://platform.openai.com/api-keys\n",
-    "!curl https://api.openai.com/v1/models -H \"Authorization: Bearer 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\""
-   ]
-  },
-  {
-   "cell_type": "code",
-   "execution_count": null,
-   "metadata": {
-    "vscode": {
-     "languageId": "shellscript"
-    }
-   },
-   "outputs": [],
-   "source": [
-    "# List the models available (Groq)\n",
-    "# get Groq API key from https://console.groq.com/keys\n",
-    "!curl https://api.groq.com/openai/v1/models -H \"Authorization: Bearer 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\""
-   ]
-  },
-  {
-   "cell_type": "markdown",
-   "metadata": {},
-   "source": [
-    "* Send a request via curl to the OpenAI API to generate embeddings for a given text input using the specified model.\n",
-    "<br/><i>(Note: Groq, does not provide embedidng models)</i>"
-   ]
-  },
-  {
-   "cell_type": "code",
-   "execution_count": null,
-   "metadata": {
-    "vscode": {
-     "languageId": "shellscript"
-    }
-   },
-   "outputs": [],
-   "source": [
-    "!curl https://api.openai.com/v1/embeddings \\\n",
-    "  -H \"Content-Type: application/json\" \\\n",
-    "  -H \"Authorization: Bearer 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\" \\\n",
-    "  -d '{\"model\": \"text-embedding-3-small\", \"input\": \"Your string to vectorize here.\" }'"
-   ]
-  },
-  {
-   "cell_type": "markdown",
-   "metadata": {},
-   "source": [
-    "* unsure what embedding models are available?"
-   ]
-  },
-  {
-   "cell_type": "code",
-   "execution_count": 39,
-   "metadata": {},
-   "outputs": [
-    {
-     "name": "stdout",
-     "output_type": "stream",
-     "text": [
-      "  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current\n",
-      "                                 Dload  Upload   Total   Spent    Left  Speed\n",
-      "100  3941  100  3941    0     0   7757      0 --:--:-- --:--:-- --:--:--  7742\n",
-      "      \"id\": \"text-embedding-3-large\",\n",
-      "      \"id\": \"text-embedding-ada-002\",\n",
-      "      \"id\": \"text-embedding-3-small\",\n"
-     ]
-    }
-   ],
-   "source": [
-    "!curl https://api.openai.com/v1/models -H \"Authorization: Bearer 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\" | grep embed"
-   ]
-  },
-  {
-   "cell_type": "markdown",
-   "metadata": {},
-   "source": [
-    "* Chat completion via API"
-   ]
-  },
-  {
-   "cell_type": "code",
-   "execution_count": 44,
-   "metadata": {
-    "vscode": {
-     "languageId": "shellscript"
-    }
-   },
-   "outputs": [
-    {
-     "name": "stdout",
-     "output_type": "stream",
-     "text": [
-      "{\n",
-      "  \"id\": \"chatcmpl-A96L0c3EjuDQpcQuq4xHS5gD2PbGD\",\n",
-      "  \"object\": \"chat.completion\",\n",
-      "  \"created\": 1726732678,\n",
-      "  \"model\": \"gpt-3.5-turbo-0125\",\n",
-      "  \"choices\": [\n",
-      "    {\n",
-      "      \"index\": 0,\n",
-      "      \"message\": {\n",
-      "        \"role\": \"assistant\",\n",
-      "        \"content\": \"You can get a chat completion by following these steps:\\n\\n1. Engage in meaningful conversation by asking questions and actively listening to the other person's responses.\\n2. Stay attentive and responsive throughout the chat to keep the conversation going and maintain a connection.\\n3. Show empathy and understanding towards the other person’s feelings and thoughts.\\n4. End the chat on a positive note by expressing gratitude, summarizing key points, and providing any necessary follow-up information.\\n5. Follow up with a polite closing statement\",\n",
-      "        \"refusal\": null\n",
-      "      },\n",
-      "      \"logprobs\": null,\n",
-      "      \"finish_reason\": \"length\"\n",
-      "    }\n",
-      "  ],\n",
-      "  \"usage\": {\n",
-      "    \"prompt_tokens\": 17,\n",
-      "    \"completion_tokens\": 100,\n",
-      "    \"total_tokens\": 117,\n",
-      "    \"completion_tokens_details\": {\n",
-      "      \"reasoning_tokens\": 0\n",
-      "    }\n",
-      "  },\n",
-      "  \"system_fingerprint\": null\n",
-      "}\n"
-     ]
-    }
-   ],
-   "source": [
-    "!curl https://api.openai.com/v1/chat/completions \\\n",
-    "  -H \"Content-Type: application/json\" \\\n",
-    "  -H \"Authorization: Bearer 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\" \\\n",
-    "  -d '{ \"model\": \"gpt-3.5-turbo\", \"messages\": [{\"role\": \"user\", \"content\": \"Hello, how can I get a chat completion?\"}],\"max_tokens\": 100 }'"
-   ]
-  }
- ],
- "metadata": {
-  "kernelspec": {
-   "display_name": "hackaton",
-   "language": "python",
-   "name": "python3"
-  },
-  "language_info": {
-   "codemirror_mode": {
-    "name": "ipython",
-    "version": 3
-   },
-   "file_extension": ".py",
-   "mimetype": "text/x-python",
-   "name": "python",
-   "nbconvert_exporter": "python",
-   "pygments_lexer": "ipython3",
-   "version": "3.10.14"
-  }
- },
- "nbformat": 4,
- "nbformat_minor": 2
-}
diff --git a/warmup/api_python.ipynb b/warmup/api_python.ipynb
new file mode 100644
index 0000000000000000000000000000000000000000..d9aea913c92214499f468459202600aa9d0d6f4b
--- /dev/null
+++ b/warmup/api_python.ipynb
@@ -0,0 +1,235 @@
+{
+ "cells": [
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "Testing connection to a remote LLM"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 2,
+   "metadata": {},
+   "outputs": [
+    {
+     "name": "stdout",
+     "output_type": "stream",
+     "text": [
+      "^C\n"
+     ]
+    },
+    {
+     "ename": "ModuleNotFoundError",
+     "evalue": "No module named 'llama_index.llms.groq'",
+     "output_type": "error",
+     "traceback": [
+      "\u001b[1;31m---------------------------------------------------------------------------\u001b[0m",
+      "\u001b[1;31mModuleNotFoundError\u001b[0m                       Traceback (most recent call last)",
+      "Cell \u001b[1;32mIn[2], line 3\u001b[0m\n\u001b[0;32m      1\u001b[0m get_ipython()\u001b[38;5;241m.\u001b[39msystem(\u001b[38;5;124m'\u001b[39m\u001b[38;5;124mpip install llama_index\u001b[39m\u001b[38;5;124m'\u001b[39m)\n\u001b[1;32m----> 3\u001b[0m \u001b[38;5;28;01mfrom\u001b[39;00m \u001b[38;5;21;01mllama_index\u001b[39;00m\u001b[38;5;21;01m.\u001b[39;00m\u001b[38;5;21;01mllms\u001b[39;00m\u001b[38;5;21;01m.\u001b[39;00m\u001b[38;5;21;01mgroq\u001b[39;00m \u001b[38;5;28;01mimport\u001b[39;00m Groq\n\u001b[0;32m      4\u001b[0m \u001b[38;5;28;01mfrom\u001b[39;00m \u001b[38;5;21;01mllama_index\u001b[39;00m\u001b[38;5;21;01m.\u001b[39;00m\u001b[38;5;21;01mllms\u001b[39;00m\u001b[38;5;21;01m.\u001b[39;00m\u001b[38;5;21;01mopenai\u001b[39;00m \u001b[38;5;28;01mimport\u001b[39;00m OpenAI\n",
+      "\u001b[1;31mModuleNotFoundError\u001b[0m: No module named 'llama_index.llms.groq'"
+     ]
+    },
+    {
+     "name": "stdout",
+     "output_type": "stream",
+     "text": [
+      "Requirement already satisfied: llama_index in c:\\users\\lehmberg\\.conda\\envs\\hackathon\\lib\\site-packages (0.11.12)\n",
+      "Requirement already satisfied: llama-index-agent-openai<0.4.0,>=0.3.4 in c:\\users\\lehmberg\\.conda\\envs\\hackathon\\lib\\site-packages (from llama_index) (0.3.4)\n",
+      "Requirement already satisfied: llama-index-cli<0.4.0,>=0.3.1 in c:\\users\\lehmberg\\.conda\\envs\\hackathon\\lib\\site-packages (from llama_index) (0.3.1)\n",
+      "Requirement already satisfied: llama-index-core<0.12.0,>=0.11.11 in c:\\users\\lehmberg\\.conda\\envs\\hackathon\\lib\\site-packages (from llama_index) (0.11.12)\n",
+      "Requirement already satisfied: llama-index-embeddings-openai<0.3.0,>=0.2.4 in c:\\users\\lehmberg\\.conda\\envs\\hackathon\\lib\\site-packages (from llama_index) (0.2.5)\n",
+      "Requirement already satisfied: llama-index-indices-managed-llama-cloud>=0.3.0 in c:\\users\\lehmberg\\.conda\\envs\\hackathon\\lib\\site-packages (from llama_index) (0.3.1)\n",
+      "Requirement already satisfied: llama-index-legacy<0.10.0,>=0.9.48 in c:\\users\\lehmberg\\.conda\\envs\\hackathon\\lib\\site-packages (from llama_index) (0.9.48.post3)\n",
+      "Requirement already satisfied: llama-index-llms-openai<0.3.0,>=0.2.9 in c:\\users\\lehmberg\\.conda\\envs\\hackathon\\lib\\site-packages (from llama_index) (0.2.9)\n",
+      "Requirement already satisfied: llama-index-multi-modal-llms-openai<0.3.0,>=0.2.0 in c:\\users\\lehmberg\\.conda\\envs\\hackathon\\lib\\site-packages (from llama_index) (0.2.1)\n",
+      "Requirement already satisfied: llama-index-program-openai<0.3.0,>=0.2.0 in c:\\users\\lehmberg\\.conda\\envs\\hackathon\\lib\\site-packages (from llama_index) (0.2.0)\n",
+      "Requirement already satisfied: llama-index-question-gen-openai<0.3.0,>=0.2.0 in c:\\users\\lehmberg\\.conda\\envs\\hackathon\\lib\\site-packages (from llama_index) (0.2.0)\n",
+      "Requirement already satisfied: llama-index-readers-file<0.3.0,>=0.2.0 in c:\\users\\lehmberg\\.conda\\envs\\hackathon\\lib\\site-packages (from llama_index) (0.2.2)\n",
+      "Requirement already satisfied: llama-index-readers-llama-parse>=0.3.0 in c:\\users\\lehmberg\\.conda\\envs\\hackathon\\lib\\site-packages (from llama_index) (0.3.0)\n",
+      "Requirement already satisfied: nltk>3.8.1 in c:\\users\\lehmberg\\.conda\\envs\\hackathon\\lib\\site-packages (from llama_index) (3.9.1)\n",
+      "Requirement already satisfied: openai>=1.14.0 in c:\\users\\lehmberg\\.conda\\envs\\hackathon\\lib\\site-packages (from llama-index-agent-openai<0.4.0,>=0.3.4->llama_index) (1.47.0)\n",
+      "Requirement already satisfied: PyYAML>=6.0.1 in c:\\users\\lehmberg\\.conda\\envs\\hackathon\\lib\\site-packages (from llama-index-core<0.12.0,>=0.11.11->llama_index) (6.0.2)\n",
+      "Requirement already satisfied: SQLAlchemy>=1.4.49 in c:\\users\\lehmberg\\.conda\\envs\\hackathon\\lib\\site-packages (from SQLAlchemy[asyncio]>=1.4.49->llama-index-core<0.12.0,>=0.11.11->llama_index) (2.0.35)\n",
+      "Requirement already satisfied: aiohttp<4.0.0,>=3.8.6 in c:\\users\\lehmberg\\.conda\\envs\\hackathon\\lib\\site-packages (from llama-index-core<0.12.0,>=0.11.11->llama_index) (3.10.5)\n",
+      "Requirement already satisfied: dataclasses-json in c:\\users\\lehmberg\\.conda\\envs\\hackathon\\lib\\site-packages (from llama-index-core<0.12.0,>=0.11.11->llama_index) (0.6.7)\n",
+      "Requirement already satisfied: deprecated>=1.2.9.3 in c:\\users\\lehmberg\\.conda\\envs\\hackathon\\lib\\site-packages (from llama-index-core<0.12.0,>=0.11.11->llama_index) (1.2.14)\n",
+      "Requirement already satisfied: dirtyjson<2.0.0,>=1.0.8 in c:\\users\\lehmberg\\.conda\\envs\\hackathon\\lib\\site-packages (from llama-index-core<0.12.0,>=0.11.11->llama_index) (1.0.8)\n",
+      "Requirement already satisfied: fsspec>=2023.5.0 in c:\\users\\lehmberg\\.conda\\envs\\hackathon\\lib\\site-packages (from llama-index-core<0.12.0,>=0.11.11->llama_index) (2024.9.0)\n",
+      "Requirement already satisfied: httpx in c:\\users\\lehmberg\\.conda\\envs\\hackathon\\lib\\site-packages (from llama-index-core<0.12.0,>=0.11.11->llama_index) (0.27.2)\n",
+      "Requirement already satisfied: nest-asyncio<2.0.0,>=1.5.8 in c:\\users\\lehmberg\\.conda\\envs\\hackathon\\lib\\site-packages (from llama-index-core<0.12.0,>=0.11.11->llama_index) (1.6.0)\n",
+      "Requirement already satisfied: networkx>=3.0 in c:\\users\\lehmberg\\.conda\\envs\\hackathon\\lib\\site-packages (from llama-index-core<0.12.0,>=0.11.11->llama_index) (3.3)\n",
+      "Requirement already satisfied: numpy<2.0.0 in c:\\users\\lehmberg\\.conda\\envs\\hackathon\\lib\\site-packages (from llama-index-core<0.12.0,>=0.11.11->llama_index) (1.26.4)\n",
+      "Requirement already satisfied: pillow>=9.0.0 in c:\\users\\lehmberg\\.conda\\envs\\hackathon\\lib\\site-packages (from llama-index-core<0.12.0,>=0.11.11->llama_index) (10.4.0)\n",
+      "Requirement already satisfied: pydantic<3.0.0,>=2.7.0 in c:\\users\\lehmberg\\.conda\\envs\\hackathon\\lib\\site-packages (from llama-index-core<0.12.0,>=0.11.11->llama_index) (2.9.2)\n",
+      "Requirement already satisfied: requests>=2.31.0 in c:\\users\\lehmberg\\.conda\\envs\\hackathon\\lib\\site-packages (from llama-index-core<0.12.0,>=0.11.11->llama_index) (2.32.3)\n",
+      "Requirement already satisfied: tenacity!=8.4.0,<9.0.0,>=8.2.0 in c:\\users\\lehmberg\\.conda\\envs\\hackathon\\lib\\site-packages (from llama-index-core<0.12.0,>=0.11.11->llama_index) (8.5.0)\n",
+      "Requirement already satisfied: tiktoken>=0.3.3 in c:\\users\\lehmberg\\.conda\\envs\\hackathon\\lib\\site-packages (from llama-index-core<0.12.0,>=0.11.11->llama_index) (0.7.0)\n",
+      "Requirement already satisfied: tqdm<5.0.0,>=4.66.1 in c:\\users\\lehmberg\\.conda\\envs\\hackathon\\lib\\site-packages (from llama-index-core<0.12.0,>=0.11.11->llama_index) (4.66.5)\n",
+      "Requirement already satisfied: typing-extensions>=4.5.0 in c:\\users\\lehmberg\\.conda\\envs\\hackathon\\lib\\site-packages (from llama-index-core<0.12.0,>=0.11.11->llama_index) (4.12.2)\n",
+      "Requirement already satisfied: typing-inspect>=0.8.0 in c:\\users\\lehmberg\\.conda\\envs\\hackathon\\lib\\site-packages (from llama-index-core<0.12.0,>=0.11.11->llama_index) (0.9.0)\n",
+      "Requirement already satisfied: wrapt in c:\\users\\lehmberg\\.conda\\envs\\hackathon\\lib\\site-packages (from llama-index-core<0.12.0,>=0.11.11->llama_index) (1.16.0)\n",
+      "Requirement already satisfied: llama-cloud>=0.0.11 in c:\\users\\lehmberg\\.conda\\envs\\hackathon\\lib\\site-packages (from llama-index-indices-managed-llama-cloud>=0.3.0->llama_index) (0.0.17)\n",
+      "Requirement already satisfied: pandas in c:\\users\\lehmberg\\.conda\\envs\\hackathon\\lib\\site-packages (from llama-index-legacy<0.10.0,>=0.9.48->llama_index) (2.2.3)\n",
+      "Requirement already satisfied: beautifulsoup4<5.0.0,>=4.12.3 in c:\\users\\lehmberg\\.conda\\envs\\hackathon\\lib\\site-packages (from llama-index-readers-file<0.3.0,>=0.2.0->llama_index) (4.12.3)\n",
+      "Requirement already satisfied: pypdf<5.0.0,>=4.0.1 in c:\\users\\lehmberg\\.conda\\envs\\hackathon\\lib\\site-packages (from llama-index-readers-file<0.3.0,>=0.2.0->llama_index) (4.3.1)\n",
+      "Requirement already satisfied: striprtf<0.0.27,>=0.0.26 in c:\\users\\lehmberg\\.conda\\envs\\hackathon\\lib\\site-packages (from llama-index-readers-file<0.3.0,>=0.2.0->llama_index) (0.0.26)\n",
+      "Requirement already satisfied: llama-parse>=0.5.0 in c:\\users\\lehmberg\\.conda\\envs\\hackathon\\lib\\site-packages (from llama-index-readers-llama-parse>=0.3.0->llama_index) (0.5.6)\n",
+      "Requirement already satisfied: click in c:\\users\\lehmberg\\.conda\\envs\\hackathon\\lib\\site-packages (from nltk>3.8.1->llama_index) (8.1.7)\n",
+      "Requirement already satisfied: joblib in c:\\users\\lehmberg\\.conda\\envs\\hackathon\\lib\\site-packages (from nltk>3.8.1->llama_index) (1.4.2)\n",
+      "Requirement already satisfied: regex>=2021.8.3 in c:\\users\\lehmberg\\.conda\\envs\\hackathon\\lib\\site-packages (from nltk>3.8.1->llama_index) (2024.9.11)\n",
+      "Requirement already satisfied: aiohappyeyeballs>=2.3.0 in c:\\users\\lehmberg\\.conda\\envs\\hackathon\\lib\\site-packages (from aiohttp<4.0.0,>=3.8.6->llama-index-core<0.12.0,>=0.11.11->llama_index) (2.4.0)\n",
+      "Requirement already satisfied: aiosignal>=1.1.2 in c:\\users\\lehmberg\\.conda\\envs\\hackathon\\lib\\site-packages (from aiohttp<4.0.0,>=3.8.6->llama-index-core<0.12.0,>=0.11.11->llama_index) (1.3.1)\n",
+      "Requirement already satisfied: attrs>=17.3.0 in c:\\users\\lehmberg\\.conda\\envs\\hackathon\\lib\\site-packages (from aiohttp<4.0.0,>=3.8.6->llama-index-core<0.12.0,>=0.11.11->llama_index) (24.2.0)\n",
+      "Requirement already satisfied: frozenlist>=1.1.1 in c:\\users\\lehmberg\\.conda\\envs\\hackathon\\lib\\site-packages (from aiohttp<4.0.0,>=3.8.6->llama-index-core<0.12.0,>=0.11.11->llama_index) (1.4.1)\n",
+      "Requirement already satisfied: multidict<7.0,>=4.5 in c:\\users\\lehmberg\\.conda\\envs\\hackathon\\lib\\site-packages (from aiohttp<4.0.0,>=3.8.6->llama-index-core<0.12.0,>=0.11.11->llama_index) (6.1.0)\n",
+      "Requirement already satisfied: yarl<2.0,>=1.0 in c:\\users\\lehmberg\\.conda\\envs\\hackathon\\lib\\site-packages (from aiohttp<4.0.0,>=3.8.6->llama-index-core<0.12.0,>=0.11.11->llama_index) (1.11.1)\n",
+      "Requirement already satisfied: async-timeout<5.0,>=4.0 in c:\\users\\lehmberg\\.conda\\envs\\hackathon\\lib\\site-packages (from aiohttp<4.0.0,>=3.8.6->llama-index-core<0.12.0,>=0.11.11->llama_index) (4.0.3)\n",
+      "Requirement already satisfied: soupsieve>1.2 in c:\\users\\lehmberg\\.conda\\envs\\hackathon\\lib\\site-packages (from beautifulsoup4<5.0.0,>=4.12.3->llama-index-readers-file<0.3.0,>=0.2.0->llama_index) (2.6)\n",
+      "Requirement already satisfied: anyio in c:\\users\\lehmberg\\.conda\\envs\\hackathon\\lib\\site-packages (from httpx->llama-index-core<0.12.0,>=0.11.11->llama_index) (4.6.0)\n",
+      "Requirement already satisfied: certifi in c:\\users\\lehmberg\\.conda\\envs\\hackathon\\lib\\site-packages (from httpx->llama-index-core<0.12.0,>=0.11.11->llama_index) (2024.8.30)\n",
+      "Requirement already satisfied: httpcore==1.* in c:\\users\\lehmberg\\.conda\\envs\\hackathon\\lib\\site-packages (from httpx->llama-index-core<0.12.0,>=0.11.11->llama_index) (1.0.5)\n",
+      "Requirement already satisfied: idna in c:\\users\\lehmberg\\.conda\\envs\\hackathon\\lib\\site-packages (from httpx->llama-index-core<0.12.0,>=0.11.11->llama_index) (3.10)\n",
+      "Requirement already satisfied: sniffio in c:\\users\\lehmberg\\.conda\\envs\\hackathon\\lib\\site-packages (from httpx->llama-index-core<0.12.0,>=0.11.11->llama_index) (1.3.1)\n",
+      "Requirement already satisfied: h11<0.15,>=0.13 in c:\\users\\lehmberg\\.conda\\envs\\hackathon\\lib\\site-packages (from httpcore==1.*->httpx->llama-index-core<0.12.0,>=0.11.11->llama_index) (0.14.0)\n",
+      "Requirement already satisfied: distro<2,>=1.7.0 in c:\\users\\lehmberg\\.conda\\envs\\hackathon\\lib\\site-packages (from openai>=1.14.0->llama-index-agent-openai<0.4.0,>=0.3.4->llama_index) (1.9.0)\n",
+      "Requirement already satisfied: jiter<1,>=0.4.0 in c:\\users\\lehmberg\\.conda\\envs\\hackathon\\lib\\site-packages (from openai>=1.14.0->llama-index-agent-openai<0.4.0,>=0.3.4->llama_index) (0.5.0)\n",
+      "Requirement already satisfied: annotated-types>=0.6.0 in c:\\users\\lehmberg\\.conda\\envs\\hackathon\\lib\\site-packages (from pydantic<3.0.0,>=2.7.0->llama-index-core<0.12.0,>=0.11.11->llama_index) (0.7.0)\n",
+      "Requirement already satisfied: pydantic-core==2.23.4 in c:\\users\\lehmberg\\.conda\\envs\\hackathon\\lib\\site-packages (from pydantic<3.0.0,>=2.7.0->llama-index-core<0.12.0,>=0.11.11->llama_index) (2.23.4)\n",
+      "Requirement already satisfied: charset-normalizer<4,>=2 in c:\\users\\lehmberg\\.conda\\envs\\hackathon\\lib\\site-packages (from requests>=2.31.0->llama-index-core<0.12.0,>=0.11.11->llama_index) (3.3.2)\n",
+      "Requirement already satisfied: urllib3<3,>=1.21.1 in c:\\users\\lehmberg\\.conda\\envs\\hackathon\\lib\\site-packages (from requests>=2.31.0->llama-index-core<0.12.0,>=0.11.11->llama_index) (2.2.3)\n",
+      "Requirement already satisfied: greenlet!=0.4.17 in c:\\users\\lehmberg\\.conda\\envs\\hackathon\\lib\\site-packages (from SQLAlchemy>=1.4.49->SQLAlchemy[asyncio]>=1.4.49->llama-index-core<0.12.0,>=0.11.11->llama_index) (3.1.1)\n",
+      "Requirement already satisfied: colorama in c:\\users\\lehmberg\\.conda\\envs\\hackathon\\lib\\site-packages (from tqdm<5.0.0,>=4.66.1->llama-index-core<0.12.0,>=0.11.11->llama_index) (0.4.6)\n",
+      "Requirement already satisfied: mypy-extensions>=0.3.0 in c:\\users\\lehmberg\\.conda\\envs\\hackathon\\lib\\site-packages (from typing-inspect>=0.8.0->llama-index-core<0.12.0,>=0.11.11->llama_index) (1.0.0)\n",
+      "Requirement already satisfied: marshmallow<4.0.0,>=3.18.0 in c:\\users\\lehmberg\\.conda\\envs\\hackathon\\lib\\site-packages (from dataclasses-json->llama-index-core<0.12.0,>=0.11.11->llama_index) (3.22.0)\n",
+      "Requirement already satisfied: python-dateutil>=2.8.2 in c:\\users\\lehmberg\\.conda\\envs\\hackathon\\lib\\site-packages (from pandas->llama-index-legacy<0.10.0,>=0.9.48->llama_index) (2.9.0)\n",
+      "Requirement already satisfied: pytz>=2020.1 in c:\\users\\lehmberg\\.conda\\envs\\hackathon\\lib\\site-packages (from pandas->llama-index-legacy<0.10.0,>=0.9.48->llama_index) (2024.2)\n",
+      "Requirement already satisfied: tzdata>=2022.7 in c:\\users\\lehmberg\\.conda\\envs\\hackathon\\lib\\site-packages (from pandas->llama-index-legacy<0.10.0,>=0.9.48->llama_index) (2024.1)\n",
+      "Requirement already satisfied: exceptiongroup>=1.0.2 in c:\\users\\lehmberg\\.conda\\envs\\hackathon\\lib\\site-packages (from anyio->httpx->llama-index-core<0.12.0,>=0.11.11->llama_index) (1.2.2)\n",
+      "Requirement already satisfied: packaging>=17.0 in c:\\users\\lehmberg\\.conda\\envs\\hackathon\\lib\\site-packages (from marshmallow<4.0.0,>=3.18.0->dataclasses-json->llama-index-core<0.12.0,>=0.11.11->llama_index) (24.1)\n",
+      "Requirement already satisfied: six>=1.5 in c:\\users\\lehmberg\\.conda\\envs\\hackathon\\lib\\site-packages (from python-dateutil>=2.8.2->pandas->llama-index-legacy<0.10.0,>=0.9.48->llama_index) (1.16.0)\n"
+     ]
+    }
+   ],
+   "source": [
+    "!pip install llama_index\n",
+    "\n",
+    "from llama_index.llms.groq import Groq\n",
+    "from llama_index.llms.openai import OpenAI"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "Set the environment variables for your API key(s)"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 9,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "import os\n",
+    "# replace the placeholders with your own keys (never commit real keys)\n",
+    "os.environ['GROQ_API_KEY'] = 'YOUR_GROQ_API_KEY'\n",
+    "os.environ['OPENAI_API_KEY'] = 'YOUR_OPENAI_API_KEY'"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "Task: pick a model, enter your API key(s), try another model, and experiment with the temperature"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 15,
+   "metadata": {},
+   "outputs": [
+    {
+     "name": "stdout",
+     "output_type": "stream",
+     "text": [
+      "Eine Frage, die viele Menschen schon einmal gestellt haben!\n",
+      "\n",
+      "Die Banane ist krumm, weil sie während ihres Wachstumsprozesses auf dem Bananenbaum eine natürliche Krümmung entwickelt. Dies liegt an der Art und Weise, wie die Banane wächst und reift.\n",
+      "\n",
+      "Banane sind eine Art von Frucht, die an der Spitze des Bananenbaums wächst. Die Frucht entwickelt sich aus einer Blüte, die sich an der Spitze eines langen Stiels befindet. Während die Banane wächst, wird sie von der Schwerkraft nach unten gezogen, was dazu führt, dass sie sich krümmt.\n",
+      "\n",
+      "Es gibt mehrere Gründe, warum Bananen krumm sind:\n",
+      "\n",
+      "1. **Schwerkraft**: Wie bereits erwähnt, wird die Banane von der Schwerkraft nach unten gezogen, was zu einer natürlichen Krümmung führt.\n",
+      "2. **Wachstumsprozess**: Die Banane wächst von der Spitze des Bananenbaums nach unten. Während sie wächst, entwickelt sie sich in einer spiraligen Form, was zu einer Krümmung führt.\n",
+      "3. **Zellstruktur**: Die Zellen in der Banane sind nicht gleichmäßig verteilt, was zu einer ungleichmäßigen Wachstumsrate führt. Dies kann zu einer Krümmung der Frucht führen.\n",
+      "4. **Evolutionäre Vorteile**: Die Krümmung der Banane kann auch evolutionäre Vorteile haben. Zum Beispiel kann die Krümmung der Frucht dazu beitragen, dass sie besser auf dem Boden liegt und weniger wahrscheinlich ist, dass sie herunterfällt.\n",
+      "\n",
+      "Es ist wichtig zu beachten, dass nicht alle Bananen krumm sind. Einige Sorten, wie die Cavendish-Banane, die in vielen Supermärkten verkauft wird, sind relativ gerade. Andere Sorten, wie die Plantain-Banane, sind jedoch sehr krumm.\n",
+      "\n",
+      "Ich hoffe, diese Antwort hat dir geholfen, die Frage zu beantworten, warum Bananen krumm sind!\n"
+     ]
+    }
+   ],
+   "source": [
+    "# pass the model name and temperature to the LLM constructor;\n",
+    "# the API key is read from the environment variable set above\n",
+    "llm = Groq(model=\"llama3-70b-8192\", temperature=0.0)\n",
+    "#llm = OpenAI(model=\"gpt-3.5-turbo\", temperature=0.0)\n",
+    "\n",
+    "# complete the prompt\n",
+    "response = llm.complete(\"Warum ist die Banane krumm?\")\n",
+    "\n",
+    "print(response)"
+   ]
+  },
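+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "Beyond `complete()`, llama_index LLMs also expose a `chat()` method that takes role-tagged messages. A minimal sketch (assumes the `ChatMessage` class from `llama_index.core.llms` and reuses the `llm` object from the cell above):"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "from llama_index.core.llms import ChatMessage\n",
+    "\n",
+    "# a system message steers the style; the user message carries the question\n",
+    "messages = [\n",
+    "    ChatMessage(role=\"system\", content=\"Antworte in einem Satz.\"),\n",
+    "    ChatMessage(role=\"user\", content=\"Warum ist die Banane krumm?\"),\n",
+    "]\n",
+    "print(llm.chat(messages))"
+   ]
+  },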
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "# ask two models the same question, then have one model compare the answers\n",
+    "llm1 = Groq(model=\"llama3-70b-8192\", temperature=0.0)\n",
+    "llm2 = OpenAI(model=\"gpt-3.5-turbo\", temperature=0.0)\n",
+    "\n",
+    "response1 = llm1.complete(\"Warum ist die Banane krumm?\")\n",
+    "response2 = llm2.complete(\"Warum ist die Banane krumm?\")\n",
+    "\n",
+    "# feed both answers back for a comparison and summary\n",
+    "response3 = llm2.complete(\n",
+    "    f\"Ich habe zwei Menschen gefragt, warum die Banane krumm ist. \"\n",
+    "    f\"Person 1 sagt: {response1} Person 2 sagt: {response2} \"\n",
+    "    f\"Wer hat recht? Fasse beide Antworten zusammen und gib mir die Antwort.\"\n",
+    ")\n",
+    "print(response3)"
+   ]
+  },
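+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "To see the effect of the `temperature` parameter, a sketch: higher values make sampling more random, while 0.0 is near-deterministic."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "# same model and prompt, different sampling temperatures\n",
+    "for temp in (0.0, 1.0):\n",
+    "    llm_t = Groq(model=\"llama3-70b-8192\", temperature=temp)\n",
+    "    print(f\"--- temperature={temp} ---\")\n",
+    "    print(llm_t.complete(\"Warum ist die Banane krumm?\"))"
+   ]
+  },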
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": []
+  }
+ ],
+ "metadata": {
+  "kernelspec": {
+   "display_name": "hackathon",
+   "language": "python",
+   "name": "python3"
+  },
+  "language_info": {
+   "codemirror_mode": {
+    "name": "ipython",
+    "version": 3
+   },
+   "file_extension": ".py",
+   "mimetype": "text/x-python",
+   "name": "python",
+   "nbconvert_exporter": "python",
+   "pygments_lexer": "ipython3",
+   "version": "3.10.14"
+  }
+ },
+ "nbformat": 4,
+ "nbformat_minor": 2
+}
diff --git a/warmup/useful_commands.ipynb b/warmup/useful_commands.ipynb
new file mode 100644
index 0000000000000000000000000000000000000000..da91f5eb112b2cf529370e9b76ffa5d4a050fb78
--- /dev/null
+++ b/warmup/useful_commands.ipynb
@@ -0,0 +1,445 @@
+{
+ "cells": [
+  {
+   "cell_type": "markdown",
+   "metadata": {
+    "vscode": {
+     "languageId": "shellscript"
+    }
+   },
+   "source": [
+    "##### Connecting to LLMs via API"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "* check API connections and available models"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 1,
+   "metadata": {
+    "vscode": {
+     "languageId": "shellscript"
+    }
+   },
+   "outputs": [
+    {
+     "name": "stdout",
+     "output_type": "stream",
+     "text": [
+      "{\n",
+      "  \"object\": \"list\",\n",
+      "  \"data\": [\n",
+      "    {\n",
+      "      \"id\": \"dall-e-2\",\n",
+      "      \"object\": \"model\",\n",
+      "      \"created\": 1698798177,\n",
+      "      \"owned_by\": \"system\"\n",
+      "    },\n",
+      "    {\n",
+      "      \"id\": \"gpt-4-1106-preview\",\n",
+      "      \"object\": \"model\",\n",
+      "      \"created\": 1698957206,\n",
+      "      \"owned_by\": \"system\"\n",
+      "    },\n",
+      "    {\n",
+      "      \"id\": \"tts-1-hd-1106\",\n",
+      "      \"object\": \"model\",\n",
+      "      \"created\": 1699053533,\n",
+      "      \"owned_by\": \"system\"\n",
+      "    },\n",
+      "    {\n",
+      "      \"id\": \"tts-1-hd\",\n",
+      "      \"object\": \"model\",\n",
+      "      \"created\": 1699046015,\n",
+      "      \"owned_by\": \"system\"\n",
+      "    },\n",
+      "    {\n",
+      "      \"id\": \"whisper-1\",\n",
+      "      \"object\": \"model\",\n",
+      "      \"created\": 1677532384,\n",
+      "      \"owned_by\": \"openai-internal\"\n",
+      "    },\n",
+      "    {\n",
+      "      \"id\": \"text-embedding-3-large\",\n",
+      "      \"object\": \"model\",\n",
+      "      \"created\": 1705953180,\n",
+      "      \"owned_by\": \"system\"\n",
+      "    },\n",
+      "    {\n",
+      "      \"id\": \"text-embedding-ada-002\",\n",
+      "      \"object\": \"model\",\n",
+      "      \"created\": 1671217299,\n",
+      "      \"owned_by\": \"openai-internal\"\n",
+      "    },\n",
+      "    {\n",
+      "      \"id\": \"gpt-4o-2024-05-13\",\n",
+      "      \"object\": \"model\",\n",
+      "      \"created\": 1715368132,\n",
+      "      \"owned_by\": \"system\"\n",
+      "    },\n",
+      "    {\n",
+      "      \"id\": \"gpt-4-0125-preview\",\n",
+      "      \"object\": \"model\",\n",
+      "      \"created\": 1706037612,\n",
+      "      \"owned_by\": \"system\"\n",
+      "    },\n",
+      "    {\n",
+      "      \"id\": \"gpt-4-turbo-preview\",\n",
+      "      \"object\": \"model\",\n",
+      "      \"created\": 1706037777,\n",
+      "      \"owned_by\": \"system\"\n",
+      "    },\n",
+      "    {\n",
+      "      \"id\": \"tts-1-1106\",\n",
+      "      \"object\": \"model\",\n",
+      "      \"created\": 1699053241,\n",
+      "      \"owned_by\": \"system\"\n",
+      "    },\n",
+      "    {\n",
+      "      \"id\": \"gpt-3.5-turbo-16k\",\n",
+      "      \"object\": \"model\",\n",
+      "      \"created\": 1683758102,\n",
+      "      \"owned_by\": \"openai-internal\"\n",
+      "    },\n",
+      "    {\n",
+      "      \"id\": \"chatgpt-4o-latest\",\n",
+      "      \"object\": \"model\",\n",
+      "      \"created\": 1723515131,\n",
+      "      \"owned_by\": \"system\"\n",
+      "    },\n",
+      "    {\n",
+      "      \"id\": \"gpt-4o-2024-08-06\",\n",
+      "      \"object\": \"model\",\n",
+      "      \"created\": 1722814719,\n",
+      "      \"owned_by\": \"system\"\n",
+      "    },\n",
+      "    {\n",
+      "      \"id\": \"gpt-3.5-turbo-1106\",\n",
+      "      \"object\": \"model\",\n",
+      "      \"created\": 1698959748,\n",
+      "      \"owned_by\": \"system\"\n",
+      "    },\n",
+      "    {\n",
+      "      \"id\": \"gpt-3.5-turbo-instruct-0914\",\n",
+      "      \"object\": \"model\",\n",
+      "      \"created\": 1694122472,\n",
+      "      \"owned_by\": \"system\"\n",
+      "    },\n",
+      "    {\n",
+      "      \"id\": \"gpt-4o\",\n",
+      "      \"object\": \"model\",\n",
+      "      \"created\": 1715367049,\n",
+      "      \"owned_by\": \"system\"\n",
+      "    },\n",
+      "    {\n",
+      "      \"id\": \"gpt-4-turbo-2024-04-09\",\n",
+      "      \"object\": \"model\",\n",
+      "      \"created\": 1712601677,\n",
+      "      \"owned_by\": \"system\"\n",
+      "    },\n",
+      "    {\n",
+      "      \"id\": \"gpt-4-turbo\",\n",
+      "      \"object\": \"model\",\n",
+      "      \"created\": 1712361441,\n",
+      "      \"owned_by\": \"system\"\n",
+      "    },\n",
+      "    {\n",
+      "      \"id\": \"gpt-4-0613\",\n",
+      "      \"object\": \"model\",\n",
+      "      \"created\": 1686588896,\n",
+      "      \"owned_by\": \"openai\"\n",
+      "    },\n",
+      "    {\n",
+      "      \"id\": \"gpt-3.5-turbo-0125\",\n",
+      "      \"object\": \"model\",\n",
+      "      \"created\": 1706048358,\n",
+      "      \"owned_by\": \"system\"\n",
+      "    },\n",
+      "    {\n",
+      "      \"id\": \"gpt-4\",\n",
+      "      \"object\": \"model\",\n",
+      "      \"created\": 1687882411,\n",
+      "      \"owned_by\": \"openai\"\n",
+      "    },\n",
+      "    {\n",
+      "      \"id\": \"text-embedding-3-small\",\n",
+      "      \"object\": \"model\",\n",
+      "      \"created\": 1705948997,\n",
+      "      \"owned_by\": \"system\"\n",
+      "    },\n",
+      "    {\n",
+      "      \"id\": \"gpt-3.5-turbo-instruct\",\n",
+      "      \"object\": \"model\",\n",
+      "      \"created\": 1692901427,\n",
+      "      \"owned_by\": \"system\"\n",
+      "    },\n",
+      "    {\n",
+      "      \"id\": \"gpt-3.5-turbo\",\n",
+      "      \"object\": \"model\",\n",
+      "      \"created\": 1677610602,\n",
+      "      \"owned_by\": \"openai\"\n",
+      "    },\n",
+      "    {\n",
+      "      \"id\": \"gpt-4o-mini-2024-07-18\",\n",
+      "      \"object\": \"model\",\n",
+      "      \"created\": 1721172717,\n",
+      "      \"owned_by\": \"system\"\n",
+      "    },\n",
+      "    {\n",
+      "      \"id\": \"gpt-4o-mini\",\n",
+      "      \"object\": \"model\",\n",
+      "      \"created\": 1721172741,\n",
+      "      \"owned_by\": \"system\"\n",
+      "    },\n",
+      "    {\n",
+      "      \"id\": \"babbage-002\",\n",
+      "      \"object\": \"model\",\n",
+      "      \"created\": 1692634615,\n",
+      "      \"owned_by\": \"system\"\n",
+      "    },\n",
+      "    {\n",
+      "      \"id\": \"davinci-002\",\n",
+      "      \"object\": \"model\",\n",
+      "      \"created\": 1692634301,\n",
+      "      \"owned_by\": \"system\"\n",
+      "    },\n",
+      "    {\n",
+      "      \"id\": \"dall-e-3\",\n",
+      "      \"object\": \"model\",\n",
+      "      \"created\": 1698785189,\n",
+      "      \"owned_by\": \"system\"\n",
+      "    },\n",
+      "    {\n",
+      "      \"id\": \"tts-1\",\n",
+      "      \"object\": \"model\",\n",
+      "      \"created\": 1681940951,\n",
+      "      \"owned_by\": \"openai-internal\"\n",
+      "    }\n",
+      "  ]\n",
+      "}\n"
+     ]
+    },
+    {
+     "name": "stderr",
+     "output_type": "stream",
+     "text": [
+      "  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current\n",
+      "                                 Dload  Upload   Total   Spent    Left  Speed\n",
+      "\n",
+      "  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0\n",
+      "100  3941  100  3941    0     0   7481      0 --:--:-- --:--:-- --:--:--  7520\n",
+      "100  3941  100  3941    0     0   7478      0 --:--:-- --:--:-- --:--:--  7520\n"
+     ]
+    }
+   ],
+   "source": [
+    "# list the available models\n",
+    "# get an OpenAI API key from https://platform.openai.com/api-keys\n",
+    "!curl https://api.openai.com/v1/models -H \"Authorization: Bearer YOUR_OPENAI_API_KEY\""
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 2,
+   "metadata": {
+    "vscode": {
+     "languageId": "shellscript"
+    }
+   },
+   "outputs": [
+    {
+     "name": "stdout",
+     "output_type": "stream",
+     "text": [
+      "{\"object\":\"list\",\"data\":[{\"id\":\"llama-guard-3-8b\",\"object\":\"model\",\"created\":1693721698,\"owned_by\":\"Meta\",\"active\":true,\"context_window\":8192,\"public_apps\":null},{\"id\":\"distil-whisper-large-v3-en\",\"object\":\"model\",\"created\":1693721698,\"owned_by\":\"Hugging Face\",\"active\":true,\"context_window\":448,\"public_apps\":null},{\"id\":\"llava-v1.5-7b-4096-preview\",\"object\":\"model\",\"created\":1725402373,\"owned_by\":\"Other\",\"active\":true,\"context_window\":4096,\"public_apps\":null},{\"id\":\"gemma-7b-it\",\"object\":\"model\",\"created\":1693721698,\"owned_by\":\"Google\",\"active\":true,\"context_window\":8192,\"public_apps\":null},{\"id\":\"llama-3.1-8b-instant\",\"object\":\"model\",\"created\":1693721698,\"owned_by\":\"Meta\",\"active\":true,\"context_window\":131072,\"public_apps\":null},{\"id\":\"llama3-70b-8192\",\"object\":\"model\",\"created\":1693721698,\"owned_by\":\"Meta\",\"active\":true,\"context_window\":8192,\"public_apps\":null},{\"id\":\"whisper-large-v3\",\"object\":\"model\",\"created\":1693721698,\"owned_by\":\"OpenAI\",\"active\":true,\"context_window\":448,\"public_apps\":null},{\"id\":\"gemma2-9b-it\",\"object\":\"model\",\"created\":1693721698,\"owned_by\":\"Google\",\"active\":true,\"context_window\":8192,\"public_apps\":null},{\"id\":\"llama-3.1-70b-versatile\",\"object\":\"model\",\"created\":1693721698,\"owned_by\":\"Meta\",\"active\":true,\"context_window\":131072,\"public_apps\":null},{\"id\":\"mixtral-8x7b-32768\",\"object\":\"model\",\"created\":1693721698,\"owned_by\":\"Mistral 
AI\",\"active\":true,\"context_window\":32768,\"public_apps\":null},{\"id\":\"llama3-8b-8192\",\"object\":\"model\",\"created\":1693721698,\"owned_by\":\"Meta\",\"active\":true,\"context_window\":8192,\"public_apps\":null},{\"id\":\"llama3-groq-8b-8192-tool-use-preview\",\"object\":\"model\",\"created\":1693721698,\"owned_by\":\"Groq\",\"active\":true,\"context_window\":8192,\"public_apps\":null},{\"id\":\"llama3-groq-70b-8192-tool-use-preview\",\"object\":\"model\",\"created\":1693721698,\"owned_by\":\"Groq\",\"active\":true,\"context_window\":8192,\"public_apps\":null}]}\n"
+     ]
+    },
+    {
+     "name": "stderr",
+     "output_type": "stream",
+     "text": [
+      "  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current\n",
+      "                                 Dload  Upload   Total   Spent    Left  Speed\n",
+      "\n",
+      "  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0\n",
+      "100  1894  100  1894    0     0   8293      0 --:--:-- --:--:-- --:--:--  8417\n"
+     ]
+    }
+   ],
+   "source": [
+    "# list the available models (Groq)\n",
+    "# get a Groq API key from https://console.groq.com/keys\n",
+    "!curl https://api.groq.com/openai/v1/models -H \"Authorization: Bearer YOUR_GROQ_API_KEY\""
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "* Send a request via curl to the OpenAI API to generate embeddings for a given text input with the specified model.\n",
+    "<br/><i>(Note: Groq does not provide embedding models.)</i>"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 3,
+   "metadata": {
+    "vscode": {
+     "languageId": "shellscript"
+    }
+   },
+   "outputs": [
+    {
+     "name": "stdout",
+     "output_type": "stream",
+     "text": [
+      "{\n",
+      "    \"error\": {\n",
+      "        \"message\": \"We could not parse the JSON body of your request. (HINT: This likely means you aren't using your HTTP library correctly. The OpenAI API expects a JSON payload, but what was sent was not valid JSON. If you have trouble figuring out how to fix this, please contact us through our help center at help.openai.com.)\",\n",
+      "        \"type\": \"invalid_request_error\",\n",
+      "        \"param\": null,\n",
+      "        \"code\": null\n",
+      "    }\n",
+      "}\n"
+     ]
+    },
+    {
+     "name": "stderr",
+     "output_type": "stream",
+     "text": [
+      "  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current\n",
+      "                                 Dload  Upload   Total   Spent    Left  Speed\n",
+      "\n",
+      "  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0\n",
+      "100   451  100   443  100     8    811     14 --:--:-- --:--:-- --:--:--   830\n",
+      "100   451  100   443  100     8    811     14 --:--:-- --:--:-- --:--:--   830\n",
+      "curl: (3) URL rejected: Bad hostname\n",
+      "curl: (3) URL rejected: Port number was not a decimal number between 0 and 65535\n",
+      "curl: (3) URL rejected: Malformed input to a URL function\n",
+      "curl: (3) unmatched close brace/bracket in URL position 1:\n",
+      "}'\n",
+      " ^\n"
+     ]
+    }
+   ],
+   "source": [
+    "# on Windows cmd, single quotes and multi-line continuations break curl\n",
+    "# (see the errors in the recorded output); keep the command on one line\n",
+    "# and escape the inner double quotes instead\n",
+    "!curl https://api.openai.com/v1/embeddings -H \"Content-Type: application/json\" -H \"Authorization: Bearer YOUR_OPENAI_API_KEY\" -d \"{\\\"model\\\": \\\"text-embedding-3-small\\\", \\\"input\\\": \\\"Ottos Mops kotzt\\\"}\""
+   ]
+  },
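+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "If shell quoting is troublesome, the same embeddings request can be sketched in Python with `requests` (assumes the key is available in the `OPENAI_API_KEY` environment variable):"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "import os\n",
+    "import requests\n",
+    "\n",
+    "r = requests.post(\n",
+    "    \"https://api.openai.com/v1/embeddings\",\n",
+    "    headers={\"Authorization\": f\"Bearer {os.environ['OPENAI_API_KEY']}\"},\n",
+    "    json={\"model\": \"text-embedding-3-small\", \"input\": \"Ottos Mops kotzt\"},\n",
+    ")\n",
+    "vector = r.json()[\"data\"][0][\"embedding\"]\n",
+    "print(len(vector))  # dimensionality of the embedding vector"
+   ]
+  },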
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "* Unsure which embedding models are available? Filter the model list:"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 39,
+   "metadata": {},
+   "outputs": [
+    {
+     "name": "stdout",
+     "output_type": "stream",
+     "text": [
+      "  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current\n",
+      "                                 Dload  Upload   Total   Spent    Left  Speed\n",
+      "100  3941  100  3941    0     0   7757      0 --:--:-- --:--:-- --:--:--  7742\n",
+      "      \"id\": \"text-embedding-3-large\",\n",
+      "      \"id\": \"text-embedding-ada-002\",\n",
+      "      \"id\": \"text-embedding-3-small\",\n"
+     ]
+    }
+   ],
+   "source": [
+    "!curl https://api.openai.com/v1/models -H \"Authorization: Bearer YOUR_OPENAI_API_KEY\" | grep embed"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "* Chat completion via API"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 44,
+   "metadata": {
+    "vscode": {
+     "languageId": "shellscript"
+    }
+   },
+   "outputs": [
+    {
+     "name": "stdout",
+     "output_type": "stream",
+     "text": [
+      "{\n",
+      "  \"id\": \"chatcmpl-A96L0c3EjuDQpcQuq4xHS5gD2PbGD\",\n",
+      "  \"object\": \"chat.completion\",\n",
+      "  \"created\": 1726732678,\n",
+      "  \"model\": \"gpt-3.5-turbo-0125\",\n",
+      "  \"choices\": [\n",
+      "    {\n",
+      "      \"index\": 0,\n",
+      "      \"message\": {\n",
+      "        \"role\": \"assistant\",\n",
+      "        \"content\": \"You can get a chat completion by following these steps:\\n\\n1. Engage in meaningful conversation by asking questions and actively listening to the other person's responses.\\n2. Stay attentive and responsive throughout the chat to keep the conversation going and maintain a connection.\\n3. Show empathy and understanding towards the other person’s feelings and thoughts.\\n4. End the chat on a positive note by expressing gratitude, summarizing key points, and providing any necessary follow-up information.\\n5. Follow up with a polite closing statement\",\n",
+      "        \"refusal\": null\n",
+      "      },\n",
+      "      \"logprobs\": null,\n",
+      "      \"finish_reason\": \"length\"\n",
+      "    }\n",
+      "  ],\n",
+      "  \"usage\": {\n",
+      "    \"prompt_tokens\": 17,\n",
+      "    \"completion_tokens\": 100,\n",
+      "    \"total_tokens\": 117,\n",
+      "    \"completion_tokens_details\": {\n",
+      "      \"reasoning_tokens\": 0\n",
+      "    }\n",
+      "  },\n",
+      "  \"system_fingerprint\": null\n",
+      "}\n"
+     ]
+    }
+   ],
+   "source": [
+    "!curl https://api.openai.com/v1/chat/completions \\\n",
+    "  -H \"Content-Type: application/json\" \\\n",
+    "  -H \"Authorization: Bearer YOUR_OPENAI_API_KEY\" \\\n",
+    "  -d '{ \"model\": \"gpt-3.5-turbo\", \"messages\": [{\"role\": \"user\", \"content\": \"Hello, how can I get a chat completion?\"}],\"max_tokens\": 100 }'"
+   ]
+  }
+ ],
+ "metadata": {
+  "kernelspec": {
+   "display_name": "hackathon",
+   "language": "python",
+   "name": "python3"
+  },
+  "language_info": {
+   "codemirror_mode": {
+    "name": "ipython",
+    "version": 3
+   },
+   "file_extension": ".py",
+   "mimetype": "text/x-python",
+   "name": "python",
+   "nbconvert_exporter": "python",
+   "pygments_lexer": "ipython3",
+   "version": "3.10.14"
+  }
+ },
+ "nbformat": 4,
+ "nbformat_minor": 2
+}