diff --git a/index.md b/index.md
index 9676c1d0eff60582e3be423526c1dba2fb1d248a..03fc15d51c69e19fa340b396274be338663a2c79 100644
--- a/index.md
+++ b/index.md
@@ -11,6 +11,61 @@ date: August 28, 2024
 
 ---
 
+# The LLM ecosystem
+
+- If it isn't on Hugging Face, it doesn't exist
+- Evaluation is HARD
+    - Benchmarks prove little
+    - 99% of new models are fine-tuned versions of existing ones...
+    - On the benchmarks themselves!
+        - Is this cheating?
+
+---
+
+![](images/llm-leaderboard-2024-08.png)
+
+---
+
+# Evaluating LLM models
+
+- LMSys' arena: [https://lmarena.ai](https://lmarena.ai)
+- Doesn't differentiate between open and closed models
+- Open models get buried under the proprietary ones
+
+---
+
+# Open Source?
+
+- Very few models are really open source
+- Most are "open source" in the sense that you can download the weights
+- Either the code isn't available, or the data
+
+---
+
+# Open Source
+
+- Outside of academia, the only 100% open one I am aware of is [OLMo](https://blog.allenai.org/hello-olmo-a-truly-open-llm-43f7e7359222) (I might be wrong)
+    - Has training code, weights and data, all open (see the sketch below)
+- German academia: [OpenGPT-X](https://opengpt-x.de/en/)
+    - For German businesses and academia
+    - Still unclear whether the training data will be open
+- EU: [TrustLLM](https://trustllm.eu/)
+    - Less emphasis on English
+    - Fully open
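+
+A minimal sketch of what "all open" means in practice - the weights are on Hugging Face, so loading them is one `pipeline` call (assumes a recent `transformers`; the model id is one example):
+
+```python
+# Sketch: load the openly released OLMo weights from Hugging Face.
+# Assumes a recent `transformers`; "allenai/OLMo-1B-hf" is one example id.
+from transformers import pipeline
+
+generate = pipeline("text-generation", model="allenai/OLMo-1B-hf")
+print(generate("Open source LLMs are", max_new_tokens=20)[0]["generated_text"])
+```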
+
+---
+
+# EU AI Act
+
+- TrustLLM will have to comply
+    - To be used commercially
+- Research-only models are exempt
+- Bureaucratic P.I.T.A. 💩
+
+---
+
+# Blablador
+
 ![](images/blablador-screenshot.png)
 
 ---
@@ -21,7 +76,7 @@ date: August 28, 2024
 - Bla-bla-bla 🗣️ + Labrador 🐕‍🦺
 - A stage for deploying and testing large language models
 - Models change constantly (constantly improving rank, some good, some awful)
-- Usually a small/fast model and fone of the top of the [HF's Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)
+- A mix of small, fast models and large, slower ones
 - It is a web server and an api server, and training code.
 
 --- 
@@ -43,7 +98,7 @@ Andrew Ng, 28.02.2024
 
 - AI is becoming basic infrastructure
 - Which historically is Open Source
-- We train a lot, deploy little: _Here is your code/weights, tschüssi!_
+- We train a lot, deploy little: _Here is your code/weights, k.thnx.bye!_
 - Little experience with dealing with LLMs
 - From the tools point of view, this is a FAST moving target 🎯💨
 - Acquire local experience in issues like
@@ -61,7 +116,6 @@ Andrew Ng, 28.02.2024
 
 ## Some facts
 
-- ***I CAN HOST YOUR MODEL***
 - No data collection at all. I don't keep ***ANY*** data whatsoever!
     - You can use it AND keep your data private
     - No records? Privacy (and GDPR is happy)
@@ -89,6 +143,17 @@ Andrew Ng, 28.02.2024
 
 ---
 
+![Juwels BOOSTER](images/juwels-booster.jpg)
+
+---
+
+# Humbler beginnings...
+
+- A small cluster inherited from other projects
+- Started with three nodes
+
+---
+
 ![Haicluster](images/IMG_7685.jpg)
 
 ---
@@ -101,6 +166,16 @@ Andrew Ng, 28.02.2024
 
 ---
 
+# User demand is growing, so we get more hardware
+
+- Currently around 300 unique users/day on the website
+- API usage is even higher, still growing, and heavier
+
+---
+
+![Jureca-DC](images/jureca-dc.png)
+
+---
 
 ## Website
 
@@ -174,14 +249,13 @@ curl --header "Authorization: Bearer MY_TOKEN_GOES_HERE"   https://helmholtz-bla
 - Inside config.json, add at the `"models"` section:
 
 - ```json
-    {
-      "title": "Mistral helmholtz",
-      "provider": "openai",
-      "contextLength": 16384,
-      "model": "Mistral-7B-Instruct-v0.2",
-      "apiKey": "YOUR_TOKEN_GOES_HERE",
-      "apiBase": "https://helmholtz-blablador.fz-juelich.de:8000"
-    },
+    {
+      "model": "AUTODETECT",
+      "title": "Blablador",
+      "apiKey": "glpat-YOURKEYHERE",
+      "apiBase": "https://helmholtz-blablador.fz-juelich.de:8000/v1",
+      "provider": "openai"
+    }
 ```
 
+- Try with the other models you got from the API! (Python sketch below)
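+
+A minimal sketch of the same API from plain Python - assuming only the `openai` package and the OpenAI-compatible endpoint above; the model id is a placeholder, pick one from the models list:
+
+```python
+# Sketch: query Blablador's OpenAI-compatible API with the `openai` package.
+from openai import OpenAI
+
+client = OpenAI(
+    api_key="glpat-YOURKEYHERE",
+    base_url="https://helmholtz-blablador.fz-juelich.de:8000/v1",
+)
+
+for model in client.models.list():  # see which models are online right now
+    print(model.id)
+
+reply = client.chat.completions.create(
+    model="Mistral-7B-Instruct-v0.2",  # placeholder - use an id from the list
+    messages=[{"role": "user", "content": "Say hello to EuroSciPy!"}],
+)
+print(reply.choices[0].message.content)
+```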
@@ -289,7 +363,7 @@ curl --header "Authorization: Bearer MY_TOKEN_GOES_HERE"   https://helmholtz-bla
 
 [https://github.com/haesleinhuepf/bia-bob](https://github.com/haesleinhuepf/bia-bob/blob/main/README.md)
 
 ---
 
 ## Todo
 
diff --git a/public/images/jureca-dc.png b/public/images/jureca-dc.png
new file mode 100644
index 0000000000000000000000000000000000000000..704a632194e63b3475a752453901784c87c81e3b
Binary files /dev/null and b/public/images/jureca-dc.png differ
diff --git a/public/images/llm-leaderboard-2024-08.png b/public/images/llm-leaderboard-2024-08.png
new file mode 100644
index 0000000000000000000000000000000000000000..3f2444b8d6fb55c091949cc7e9ff55459e5eeaea
Binary files /dev/null and b/public/images/llm-leaderboard-2024-08.png differ
diff --git a/public/index.html b/public/index.html
index 61e0c641930a316b2cd8d0cb5869bef51a6de083..7ab94a02e6b0d7914cfa0ed5b4d0fa26fd094401 100644
--- a/public/index.html
+++ b/public/index.html
@@ -241,12 +241,84 @@ alt="https://go.fzj.de/2024-08-euroscipy" />
 aria-hidden="true">https://go.fzj.de/2024-08-euroscipy</figcaption>
 </figure>
 </section>
+<section id="the-llm-ecosystem" class="slide level1">
+<h1>The LLM ecosystem</h1>
+<ul>
+<li class="fragment">If it isn’t on huggingface, it doesn’t exist</li>
+<li class="fragment">Evaluation is HARD
+<ul>
+<li class="fragment">Benchmarks prove little</li>
+<li class="fragment">99% of new models are fine-tuned versions of
+existing ones…</li>
+<li class="fragment">On the benchmarks themselves!
+<ul>
+<li class="fragment">Is this cheating?</li>
+</ul></li>
+</ul></li>
+</ul>
+</section>
 <section class="slide level1">
 
-<p><img data-src="images/blablador-screenshot.png" /></p>
+<p><img data-src="images/llm-leaderboard-2024-08.png" /></p>
+</section>
+<section id="evaluating-llm-models" class="slide level1">
+<h1>Evaluating LLM models</h1>
+<ul>
+<li class="fragment">LMSys’ arena: <a
+href="https://lmarena.ai">https://lmarena.ai</a></li>
+<li class="fragment">Doesn’t differentiate between open and closed</li>
+<li class="fragment">Open source gets buried under all private ones</li>
+</ul>
+</section>
+<section id="open-source" class="slide level1">
+<h1>Open Source?</h1>
+<ul>
+<li class="fragment">Very few models are really open source</li>
+<li class="fragment">Most are “open source” in the sense that you can
+download the weights</li>
+<li class="fragment">Either the code isn’t available, or the data</li>
+</ul>
+</section>
+<section id="open-source-1" class="slide level1">
+<h1>Open Source</h1>
+<ul>
+<li class="fragment">Outside of Academia, the only one 100% open I am
+aware of is <a
+href="https://blog.allenai.org/hello-olmo-a-truly-open-llm-43f7e7359222">OLMo</a>
+(I might be wrong)
+<ul>
+<li class="fragment">Has training code, weights and data, all open</li>
+</ul></li>
+<li class="fragment">German academia: <a
+href="https://opengpt-x.de/en/">OpenGPT-X</a>
+<ul>
+<li class="fragment">For German businesses and academia</li>
+<li class="fragment">Yet unclear if training data will be open</li>
+</ul></li>
+<li class="fragment">EU: <a href="https://trustllm.eu/">TrustLLM</a>
+<ul>
+<li class="fragment">Less emphasis on English</li>
+<li class="fragment">Fully open</li>
+</ul></li>
+</ul>
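+<p>A minimal sketch of what “all open” means in practice - the weights
+are on Hugging Face, so loading them is one <code>pipeline</code> call
+(assumes a recent <code>transformers</code>; the model id is one
+example):</p>
+<pre class="sourceCode python"><code># Sketch: load the openly released OLMo weights from Hugging Face.
+# Assumes a recent `transformers`; &quot;allenai/OLMo-1B-hf&quot; is one example id.
+from transformers import pipeline
+
+generate = pipeline(&quot;text-generation&quot;, model=&quot;allenai/OLMo-1B-hf&quot;)
+print(generate(&quot;Open source LLMs are&quot;, max_new_tokens=20)[0][&quot;generated_text&quot;])</code></pre>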
+</section>
+<section id="eu-ai-act" class="slide level1">
+<h1>EU AI Act</h1>
+<ul>
+<li class="fragment">TrustLLM will have to comply
+<ul>
+<li class="fragment">To be used commercially</li>
+</ul></li>
+<li class="fragment">Research-only models are exempt</li>
+<li class="fragment">Bureaucratic P.I.T.A. 💩</li>
+</ul>
 </section>
 <section id="blablador" class="slide level1">
 <h1>Blablador</h1>
+<p><img data-src="images/blablador-screenshot.png" /></p>
+</section>
+<section id="blablador-1" class="slide level1">
+<h1>Blablador</h1>
 <ul>
 <li class="fragment">/ˈblæblæˌdɔɹ/</li>
 <li class="fragment">Bla-bla-bla 🗣️ + Labrador 🐕‍🦺</li>
@@ -254,10 +326,8 @@ aria-hidden="true">https://go.fzj.de/2024-08-euroscipy</figcaption>
 models</li>
 <li class="fragment">Models change constantly (constantly improving
 rank, some good, some awful)</li>
-<li class="fragment">Usually a small/fast model and fone of the top of
-the <a
-href="https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard">HF’s
-Open LLM Leaderboard</a></li>
+<li class="fragment">A mix of small, fast models and large, slower ones
+- changes constantly</li>
 <li class="fragment">It is a web server and an api server, and training
 code.</li>
 </ul>
@@ -293,7 +363,7 @@ in computer science or software engineering.”</p>
 <li class="fragment">AI is becoming basic infrastructure</li>
 <li class="fragment">Which historically is Open Source</li>
 <li class="fragment">We train a lot, deploy little: <em>Here is your
-code/weights, tschüssi!</em></li>
+code/weights, k.thnx.bye!</em></li>
 <li class="fragment">Little experience with dealing with LLMs</li>
 <li class="fragment">From the tools point of view, this is a FAST moving
 target 🎯💨</li>
@@ -317,8 +387,6 @@ comes</li>
 
 <h2 id="some-facts">Some facts</h2>
 <ul>
-<li class="fragment"><strong><em>I CAN HOST YOUR
-MODEL</em></strong></li>
 <li class="fragment">No data collection at all. I don’t keep
 <strong><em>ANY</em></strong> data whatsoever!
 <ul>
@@ -358,6 +426,20 @@ documented or well-tested.</li>
 </section>
 <section class="slide level1">
 
+<figure>
+<img data-src="images/juwels-booster.jpg" alt="Juwels BOOSTER" />
+<figcaption aria-hidden="true">Juwels BOOSTER</figcaption>
+</figure>
+</section>
+<section id="humbler-beginnings" class="slide level1">
+<h1>Humbler beginnings…</h1>
+<ul>
+<li class="fragment">Small cluster inherited from other projects</li>
+<li class="fragment">Started small, with three nodes</li>
+</ul>
+</section>
+<section class="slide level1">
+
 <figure>
 <img data-src="images/IMG_7685.jpg" alt="Haicluster" />
 <figcaption aria-hidden="true">Haicluster</figcaption>
@@ -377,6 +459,22 @@ documented or well-tested.</li>
 <figcaption aria-hidden="true">Haicluster</figcaption>
 </figure>
 </section>
+<section id="user-demand-is-growing-we-get-more-hardware"
+class="slide level1">
+<h1>User demand is growing, so we get more hardware</h1>
+<ul>
+<li class="fragment">Currently around 300 unique users/day on the
+website</li>
+<li class="fragment">API usage is higher, growing and heavier</li>
+</ul>
+</section>
+<section class="slide level1">
+
+<figure>
+<img data-src="images/jureca-dc.png" alt="Jureca-DC" />
+<figcaption aria-hidden="true">Jureca-DC</figcaption>
+</figure>
+</section>
 <section class="slide level1">
 
 <h2 id="website">Website</h2>
@@ -462,14 +560,13 @@ OpenAI-compatible API</li>
 <li class="fragment"><p>Inside config.json, add at the
 <code>"models"</code> section:</p></li>
 <li class="fragment"><div class="sourceCode" id="cb2"><pre
-class="sourceCode json"><code class="sourceCode json"><span id="cb2-1"><a href="#cb2-1" aria-hidden="true" tabindex="-1"></a>    <span class="fu">{</span></span>
-<span id="cb2-2"><a href="#cb2-2" aria-hidden="true" tabindex="-1"></a>      <span class="dt">&quot;title&quot;</span><span class="fu">:</span> <span class="st">&quot;Mistral helmholtz&quot;</span><span class="fu">,</span></span>
-<span id="cb2-3"><a href="#cb2-3" aria-hidden="true" tabindex="-1"></a>      <span class="dt">&quot;provider&quot;</span><span class="fu">:</span> <span class="st">&quot;openai&quot;</span><span class="fu">,</span></span>
-<span id="cb2-4"><a href="#cb2-4" aria-hidden="true" tabindex="-1"></a>      <span class="dt">&quot;contextLength&quot;</span><span class="fu">:</span> <span class="dv">16384</span><span class="fu">,</span></span>
-<span id="cb2-5"><a href="#cb2-5" aria-hidden="true" tabindex="-1"></a>      <span class="dt">&quot;model&quot;</span><span class="fu">:</span> <span class="st">&quot;Mistral-7B-Instruct-v0.2&quot;</span><span class="fu">,</span></span>
-<span id="cb2-6"><a href="#cb2-6" aria-hidden="true" tabindex="-1"></a>      <span class="dt">&quot;apiKey&quot;</span><span class="fu">:</span> <span class="st">&quot;YOUR_TOKEN_GOES_HERE&quot;</span><span class="fu">,</span></span>
-<span id="cb2-7"><a href="#cb2-7" aria-hidden="true" tabindex="-1"></a>      <span class="dt">&quot;apiBase&quot;</span><span class="fu">:</span> <span class="st">&quot;https://helmholtz-blablador.fz-juelich.de:8000&quot;</span></span>
-<span id="cb2-8"><a href="#cb2-8" aria-hidden="true" tabindex="-1"></a>    <span class="fu">}</span><span class="er">,</span></span></code></pre></div></li>
+class="sourceCode json"><code class="sourceCode json"><span id="cb2-1"><a href="#cb2-1" aria-hidden="true" tabindex="-1"></a> <span class="fu">{</span></span>
+<span id="cb2-2"><a href="#cb2-2" aria-hidden="true" tabindex="-1"></a>      <span class="dt">&quot;model&quot;</span><span class="fu">:</span> <span class="st">&quot;AUTODETECT&quot;</span><span class="fu">,</span></span>
+<span id="cb2-3"><a href="#cb2-3" aria-hidden="true" tabindex="-1"></a>      <span class="dt">&quot;title&quot;</span><span class="fu">:</span> <span class="st">&quot;Blablador&quot;</span><span class="fu">,</span></span>
+<span id="cb2-4"><a href="#cb2-4" aria-hidden="true" tabindex="-1"></a>      <span class="dt">&quot;apiKey&quot;</span><span class="fu">:</span> <span class="st">&quot;glpat-YOURKEYHERE&quot;</span><span class="fu">,</span></span>
+<span id="cb2-5"><a href="#cb2-5" aria-hidden="true" tabindex="-1"></a>      <span class="dt">&quot;apiBase&quot;</span><span class="fu">:</span> <span class="st">&quot;https://helmholtz-blablador.fz-juelich.de:8000/v1&quot;</span><span class="fu">,</span></span>
+<span id="cb2-6"><a href="#cb2-6" aria-hidden="true" tabindex="-1"></a>      <span class="dt">&quot;provider&quot;</span><span class="fu">:</span> <span class="st">&quot;openai&quot;</span></span>
+<span id="cb2-7"><a href="#cb2-7" aria-hidden="true" tabindex="-1"></a>    <span class="fu">}</span></span></code></pre></div></li>
 <li class="fragment"><p>Try with the other models you got from the
 API!</p></li>
 </ul>
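+<p>A minimal sketch of the same API from plain Python - assuming only
+the <code>openai</code> package and the OpenAI-compatible endpoint
+above; the model id is a placeholder, pick one from the models
+list:</p>
+<pre class="sourceCode python"><code># Sketch: query Blablador&#39;s OpenAI-compatible API with the `openai` package.
+from openai import OpenAI
+
+client = OpenAI(
+    api_key=&quot;glpat-YOURKEYHERE&quot;,
+    base_url=&quot;https://helmholtz-blablador.fz-juelich.de:8000/v1&quot;,
+)
+
+for model in client.models.list():  # see which models are online right now
+    print(model.id)
+
+reply = client.chat.completions.create(
+    model=&quot;Mistral-7B-Instruct-v0.2&quot;,  # placeholder - use an id from the list
+    messages=[{&quot;role&quot;: &quot;user&quot;, &quot;content&quot;: &quot;Say hello to EuroSciPy!&quot;}],
+)
+print(reply.choices[0].message.content)</code></pre>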
@@ -612,9 +709,6 @@ on their IEK7Cloud</li>
 <p><img data-src="images/bia-bob.gif" /></p>
 <p><a
 href="https://github.com/haesleinhuepf/bia-bob/blob/main/README.md">https://github.com/haesleinhuepf/bia-bob</a></p>
 </section>
 <section class="slide level1">
 
 <h2 id="todo">Todo</h2>
 <ul>
 <li class="fragment">Multi-modal models (text+image, text+audio,