Commit 0d7f3331 authored by Alexandre Strube

Update index.md to enhance structure and clarity, including revised subtitles, added usage section, and improved content organization

parent 3bb58931
---
author: Alexandre Strube
title: BLABLADOR ![](images/blablador.png){ width=550px }
subtitle: JSC's Inference Infrastructure
date: December 4th, 2024
---
## Website
![https://helmholtz-blablador.fz-juelich.de](images/blablador-qrcode.png){width=450px}
- Play around! 🐶
---
## OUTLINE
- Past
- Present
- Future
---
...@@ -22,13 +32,17 @@ date: December 4th, 2024
---
![](images/IMG_6426.mov)
---
# Blablador
![](images/blablador-screenshot.png)
---
# Past: Why?
- AI is becoming basic infrastructure
- Which historically is Open Source
...@@ -45,6 +59,10 @@ date: December 4th, 2024
---
![](images/IMG_6549.jpg)
---
# Why, part 2
- Projects like OpenGPT-X, TrustLLM need a place to run
...@@ -54,7 +72,7 @@ date: December 4th, 2024
---
## Privacy is our selling point
- No data collection at all. I don't keep ***ANY*** data whatsoever!
- You can use it AND keep your data private
...@@ -75,12 +93,7 @@ date: December 4th, 2024
---
![](images/IMG_6579.jpg)
---
...@@ -92,14 +105,35 @@ date: December 4th, 2024
--- ---
![](images/IMG_6562.jpg)
---
# PRESENT
---
## Usage
![](images/usage_blablador_web.svg)
- Web UI only
- API usage wasn't recorded until we moved to a new host; devs are still migrating
- Some healthy usage by tools, e.g. the B2DROP assistant: around 400 requests/day (counted as a single IP from B2DROP's server)
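For API users, the service speaks the standard OpenAI chat-completions protocol, so a plain HTTP call is enough. A minimal, stdlib-only sketch — the endpoint URL and the `alias-fast` model name here are assumptions; check the Blablador website for current values and get an API key first:

```python
import json
import os
import urllib.request

# Assumed endpoint and model alias -- verify both on the Blablador website.
API_URL = "https://api.helmholtz-blablador.fz-juelich.de/v1/chat/completions"

def build_chat_request(prompt: str, model: str = "alias-fast") -> dict:
    """Assemble an OpenAI-style chat-completions payload."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.1,
    }

payload = build_chat_request("What is the Juelich Supercomputing Centre?")

# Only contact the server if a key is configured; nothing is sent otherwise.
api_key = os.environ.get("BLABLADOR_API_KEY")
if api_key:
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode(),
        headers={"Authorization": f"Bearer {api_key}",
                 "Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=60) as resp:
        answer = json.load(resp)
        print(answer["choices"][0]["message"]["content"])
```

Because the payload format is the OpenAI one, existing tooling (Continue.dev, LangChain, the `openai` client) works by just pointing it at the base URL.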
---
# The LLM ecosystem
- For open models: If it isn't on huggingface, it doesn't exist
- The "open" ecosystem is dominated by a few big players: Meta, Mistral.AI, Google
- Meta has the best open frontier-level models; but no training data
- Mistral is a close second - also no training data
- Google is catching up FAST on closed models (and leaving open behind)
- AllenAI is catching up and it's open
- Microsoft has tiny, bad ones (but I wouldn't bet against them)
- Apple is going their own way + using ChatGPT
- Twitter/X has Grok-2 for paying customers, Grok-1 is enormous and "old" (from March)
- Anthropic is receiving billions from Amazon but Claude is completely closed
---
...@@ -176,10 +210,16 @@ date: December 4th, 2024
# EU AI Act
- "GPAI models present systemic risks when the cumulative amount of compute used for its training is greater than 10^25 floating point operations (FLOPs)"
- Moore's law has something to say about this
- TrustLLM and OpenGPT-X will have to comply
    - To be used commercially
- Research-only models are exempt
- Bureaucratic shot in the foot: the EU AI Act will make it harder for EU models to compete internationally
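The 10^25 FLOP threshold is easy to put in context with the usual back-of-the-envelope estimate of training compute, FLOPs ≈ 6 × parameters × tokens. The parameter and token counts below are rough public figures, used purely for illustration:

```python
def training_flops(params: float, tokens: float) -> float:
    """Rough training-compute estimate: ~6 FLOPs per parameter per token."""
    return 6 * params * tokens

THRESHOLD = 1e25  # EU AI Act systemic-risk line

examples = {
    "405B model on 15T tokens": training_flops(405e9, 15e12),  # ~3.6e25
    "7B model on 2T tokens": training_flops(7e9, 2e12),        # ~8.4e22
}
for name, flops in examples.items():
    side = "above" if flops > THRESHOLD else "below"
    print(f"{name}: ~{flops:.1e} FLOPs ({side} the threshold)")
```

So today's largest open models already sit above the line, while typical 7B-class models are roughly two orders of magnitude below it; as compute gets cheaper (the Moore's law point), more models will cross it.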
---
![](images/IMG_6553.jpg)
---
...@@ -193,6 +233,7 @@ date: December 4th, 2024
- Small cluster inherited from other projects
- Started small, with three nodes
- Runs Slurm, has a parallel FS/Storage, uses EasyBuild
---
...@@ -215,15 +256,15 @@ date: December 4th, 2024
---
## Big models need big hardware
- Llama 3.1 405b runs on WestAI nodes
    - Launched with UNICORE
- QwQ runs there experimentally
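Why a 405B model needs multiple nodes: at 2 bytes per parameter (bf16), the weights alone exceed any single accelerator's memory, before counting KV cache and activations. The 94 GB per-GPU figure below is an assumption (H100 NVL-class cards), not the documented WestAI configuration:

```python
import math

def weight_memory_gb(params: float, bytes_per_param: int = 2) -> float:
    """Memory needed for the weights alone (bf16/fp16 = 2 bytes/param)."""
    return params * bytes_per_param / 1e9

weights_gb = weight_memory_gb(405e9)       # 810 GB of weights for a 405B model
gpu_mem_gb = 94                            # assumed per-GPU HBM capacity
gpus = math.ceil(weights_gb / gpu_mem_gb)  # minimum GPU count, weights only
print(f"{weights_gb:.0f} GB of weights -> at least {gpus} GPUs of {gpu_mem_gb} GB")
```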
---
![Jureca-DC](images/jureca-dc.png)
---
...@@ -318,30 +359,73 @@ date: December 4th, 2024
---
# FUTURE
---
![Pickle is hungry](images/IMG_6573.jpg)
---
## Vision for the (near) future
---
# Blablador, the brand, as an umbrella for *ALL* inference at JSC
---
## JSC Inference umbrella
- Blablador as LLM is the first step
- Grow to include other types of models for Science and Industry
---
## Use cases so far
- LLMs for science (e.g. OpenGPT-X, TrustLLM, CosmoSage)
- Ancillary models stemming from [Prithvi-EO-2.0](https://www.earthdata.nasa.gov/dashboard/labs/earthinsights/explore) (300M and 600M), an EO foundation model jointly developed by IBM, NASA, and JSC; release tomorrow, Dec. 5th.
    - Weather (big) and geospatial downstream tasks
- NASA's [Helio](https://arxiv.org/abs/2410.10841) (downstream tasks in Heliophysics)
- JSC/ESA/DLR's upcoming model [FAST-EO](https://eo4society.esa.int/projects/fast-eo/)
- Health: Radiology with Aachen Uniklinik
- JSC/CERN/ECMWF's [AtmoRep](https://www.atmorep.org)
- Open models:
    - Pangu-Weather
    - GraphCast
- With privacy! - With privacy!
--- ---
![What do we have to do?](images/IMG_6551.jpg)
---
## Todo
- Multi-modality: video, audio, text, images
- Auto-RAG with privacy:
    - Easy to do badly. Hard to do securely.
- Everything from the previous slide
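What "Auto-RAG with privacy" boils down to mechanically: retrieval runs locally, and only the retrieved snippet plus the question ever reach the model. A deliberately tiny, stdlib-only sketch with bag-of-words cosine similarity standing in for real embeddings (a production system would use a proper embedding model and vector store):

```python
import math
import re
from collections import Counter

def bow(text: str) -> Counter:
    """Bag-of-words vector (stand-in for a real embedding)."""
    return Counter(re.findall(r"[a-z0-9]+", text.lower()))

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[w] * b[w] for w in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def retrieve(query: str, docs: list[str], k: int = 1) -> list[str]:
    """Rank local documents against the query; documents never leave the machine."""
    q = bow(query)
    return sorted(docs, key=lambda d: cosine(q, bow(d)), reverse=True)[:k]

docs = [
    "Blablador is JSC's inference service for large language models.",
    "JURECA-DC is a modular supercomputer at Forschungszentrum Juelich.",
]
question = "Which service runs language model inference?"
context = retrieve(question, docs)[0]
# Only this prompt (retrieved context + question) would be sent to the LLM.
prompt = f"Context: {context}\nQuestion: {question}"
print(prompt)
```

The "hard to do securely" part is everything around this sketch: access control on the document store, making sure the index itself doesn't leak across users, and keeping the LLM endpoint as data-free as the rest of Blablador.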
---
## Potato
![](images/IMG_6561.jpg){ width=350px }
---
# Take the slides with you
![https://go.fzj.de/2024-12-jsc-colloquium](images/talk-jsc-colloquium.png){ width=500px }
---
## Questions?
![No dogs have been harmed for this presentation](images/blablador-questions.jpg){ width=500px }
---
...@@ -349,6 +433,14 @@ date: December 4th, 2024
---
## LLMOps resource
A No-BS Database of How Companies Actually Deploy LLMs in Production: 300+ Technical Case Studies, Including Self-Hosted LLMs in [https://www.zenml.io/llmops-database](https://www.zenml.io/llmops-database)
---
> "I think the complexity of Python package management holds down AI application development more than is widely appreciated. AI faces multiple bottlenecks — we need more GPUs, better algorithms, cleaner data in large quantities. But when I look at the day-to-day work of application builders, there’s one additional bottleneck that I think is underappreciated: The time spent wrestling with version management is an inefficiency I hope we can reduce."

Andrew Ng, 28.02.2024
...
Files added:
- public/images/IMG_6549.jpg (749 KiB)
- public/images/IMG_6551.jpg (484 KiB)
- public/images/IMG_6553.jpg (681 KiB)
- public/images/IMG_6561.jpg (150 KiB)
- public/images/IMG_6562.jpg (578 KiB)
- public/images/IMG_6573.jpg (251 KiB)
- public/images/IMG_6579.jpg (384 KiB)
...@@ -226,19 +226,32 @@
<section id="title-slide">
<h1 class="title">BLABLADOR <img data-src="images/blablador.png"
width="550" /></h1>
<p class="subtitle">JSC’s Inference Infrastructure</p>
<p class="author">Alexandre Strube</p>
<p class="date">December 4th, 2024</p>
</section>
<section class="slide level1">
<h2 id="website">Website</h2>
<figure>
<img data-src="images/blablador-qrcode.png" width="450"
alt="https://helmholtz-blablador.fz-juelich.de" />
<figcaption
aria-hidden="true">https://helmholtz-blablador.fz-juelich.de</figcaption>
</figure>
<ul>
<li class="fragment">Play around! 🐶</li>
</ul>
</section>
<section class="slide level1">
<h2 id="outline">OUTLINE</h2>
<ul>
<li class="fragment">Past</li>
<li class="fragment">Present</li>
<li class="fragment">Future</li>
</ul>
</section>
<section id="blablador" class="slide level1">
<h1>Blablador</h1>
...@@ -255,12 +268,17 @@ rank, some good, some awful)</li>
and training code.</li>
</ul>
</section>
<section class="slide level1">
<p><video data-src="images/IMG_6426.mov" controls=""><a
href="images/IMG_6426.mov">Video</a></video></p>
</section>
<section id="blablador-1" class="slide level1">
<h1>Blablador</h1>
<p><img data-src="images/blablador-screenshot.png" /></p>
</section>
<section id="past-why" class="slide level1">
<h1>Past: Why?</h1>
<ul>
<li class="fragment">AI is becoming basic infrastructure</li>
<li class="fragment">Which historically is Open Source</li>
...@@ -280,6 +298,10 @@ target 🎯💨</li>
</ul></li>
</ul>
</section>
<section class="slide level1">
<p><img data-src="images/IMG_6549.jpg" /></p>
</section>
<section id="why-part-2" class="slide level1">
<h1>Why, part 2</h1>
<ul>
...@@ -294,7 +316,7 @@ run</li>
</section>
<section class="slide level1">
<h2 id="privacy-is-our-selling-point">Privacy is our selling point</h2>
<ul>
<li class="fragment">No data collection at all. I don’t keep
<strong><em>ANY</em></strong> data whatsoever!
...@@ -324,15 +346,7 @@ it, contact me!</em></strong></li>
</section>
<section class="slide level1">
<p><img data-src="images/IMG_6579.jpg" /></p>
</section>
<section class="slide level1">
...@@ -345,23 +359,49 @@ Blablador’s API (VSCode’s Continue.dev, LangChain, etc)</li>
than unique visits/day)</li>
</ul>
</section>
<section class="slide level1">
<p><img data-src="images/IMG_6562.jpg" /></p>
</section>
<section id="present" class="slide level1">
<h1>PRESENT</h1>
</section>
<section class="slide level1">
<h2 id="usage">Usage</h2>
<p><img data-src="images/usage_blablador_web.svg" /></p>
<ul>
<li class="fragment">Web UI only</li>
<li class="fragment">API usage wasn’t recorded until we moved to a new
host; devs are still migrating</li>
<li class="fragment">Some healthy usage by tools, e.g. the B2DROP
assistant: around 400 requests/day (counted as a single IP from B2DROP’s
server)</li>
</ul>
</section>
<section id="the-llm-ecosystem" class="slide level1">
<h1>The LLM ecosystem</h1>
<ul>
<li class="fragment">For open models: If it isn’t on huggingface, it
doesn’t exist</li>
<li class="fragment">The “open” ecosystem is dominated by a few big
players: Meta, Mistral.AI, Google</li>
<li class="fragment">Meta has the best open frontier-level models; but
no training data</li>
<li class="fragment">Mistral is a close second - also no training
data</li>
<li class="fragment">Google is catching up FAST on closed models (and
leaving open behind)</li>
<li class="fragment">AllenAI is catching up and it’s open</li>
<li class="fragment">Microsoft has tiny, bad ones (but I wouldn’t bet
against them)</li>
<li class="fragment">Apple is going their own way + using ChatGPT</li>
<li class="fragment">Twitter/X has Grok-2 for paying customers, Grok-1
is enormous and “old” (from March)</li>
<li class="fragment">Anthropic is receiving billions from Amazon but
Claude is completely closed</li>
</ul>
</section>
<section id="the-llm-ecosystem-1" class="slide level1">
<h1>The LLM ecosystem</h1>
<ul>
<li class="fragment">Evaluation is HARD
...@@ -461,15 +501,23 @@ MIT</li>
<section id="eu-ai-act" class="slide level1">
<h1>EU AI Act</h1>
<ul>
<li class="fragment">“GPAI models present systemic risks when the
cumulative amount of compute used for its training is greater than 10^25
floating point operations (FLOPs)”</li>
<li class="fragment">Moore’s law has something to say about this</li>
<li class="fragment">TrustLLM and OpenGPT-X will have to comply
<ul>
<li class="fragment">To be used commercially</li>
</ul></li>
<li class="fragment">Research-only models are exempt</li>
<li class="fragment">Bureaucratic shot in the foot: the EU AI Act will
make it harder for EU models to compete internationally</li>
</ul>
</section>
<section class="slide level1">
<p><img data-src="images/IMG_6553.jpg" /></p>
</section>
<section id="juelich-supercomputing-centre" class="slide level1">
<h1>Juelich Supercomputing Centre</h1>
<figure>
...@@ -482,6 +530,8 @@ make it harder for EU models to compete with US ones</li>
<ul>
<li class="fragment">Small cluster inherited from other projects</li>
<li class="fragment">Started small, with three nodes</li>
<li class="fragment">Runs Slurm, has a parallel FS/Storage, uses
EasyBuild</li>
</ul>
</section>
<section class="slide level1">
...@@ -516,23 +566,21 @@ website</li>
</section>
<section class="slide level1">
<h2 id="big-models-need-big-hardware">Big models need big hardware</h2>
<ul>
<li class="fragment">Llama 3.1 405b runs on WestAI nodes
<ul>
<li class="fragment">Launched with UNICORE</li>
</ul></li>
<li class="fragment">QwQ runs there experimentally</li>
</ul>
</section>
<section class="slide level1">
<figure>
<img data-src="images/jureca-dc.png" alt="Jureca-DC" />
<figcaption aria-hidden="true">Jureca-DC</figcaption>
</figure>
</section>
<section class="slide level1">
...@@ -671,39 +719,106 @@ on their IEK7Cloud</li>
<p><a
href="https://github.com/haesleinhuepf/bia-bob/blob/main/README.md">https://github.com/haesleinhuepf/bia-bob</a></p>
</section>
<section id="future" class="slide level1">
<h1>FUTURE</h1>
</section>
<section class="slide level1">
<figure>
<img data-src="images/IMG_6573.jpg" alt="Pickle is hungry" />
<figcaption aria-hidden="true">Pickle is hungry</figcaption>
</figure>
</section>
<section class="slide level1">
<h2 id="vision-for-the-near-future">Vision for the (near) future</h2>
</section>
<section
id="blablador-the-brand-as-an-umbrella-for-all-inference-at-jsc"
class="slide level1">
<h1>Blablador, the brand, as an umbrella for <em>ALL</em> inference at
JSC</h1>
</section>
<section class="slide level1">
<h2 id="jsc-inference-umbrella">JSC Inference umbrella</h2>
<ul>
<li class="fragment">Blablador as LLM is the first step</li>
<li class="fragment">Grow to include other types of models for Science
and Industry</li>
</ul>
</section>
<section class="slide level1">
<h2 id="use-cases-so-far">Use cases so far</h2>
<ul>
<li class="fragment">LLMs for science (e.g. OpenGPT-X, TrustLLM,
CosmoSage)</li>
<li class="fragment">Ancillary models stemming from <a
href="https://www.earthdata.nasa.gov/dashboard/labs/earthinsights/explore">Prithvi-EO-2.0</a>
(300M and 600M), an EO foundation model jointly developed by IBM, NASA,
and JSC; release tomorrow, Dec. 5th.
<ul>
<li class="fragment">Weather (big) and geospatial downstream tasks</li>
</ul></li>
<li class="fragment">NASA’s <a
href="https://arxiv.org/abs/2410.10841">Helio</a> (downstream tasks in
Heliophysics)</li>
<li class="fragment">JSC/ESA/DLR’s upcoming model <a
href="https://eo4society.esa.int/projects/fast-eo/">FAST-EO</a></li>
<li class="fragment">Health: Radiology with Aachen Uniklinik</li>
<li class="fragment">JSC/CERN/ECMWF’s <a
href="https://www.atmorep.org">AtmoRep</a></li>
<li class="fragment">Open models:
<ul>
<li class="fragment">Pangu-Weather</li>
<li class="fragment">GraphCast</li>
</ul></li>
<li class="fragment">With privacy!</li>
</ul>
</section>
<section class="slide level1">
<figure>
<img data-src="images/IMG_6551.jpg" alt="What do we have to do?" />
<figcaption aria-hidden="true">What do we have to do?</figcaption>
</figure>
</section>
<section class="slide level1">
<h2 id="todo">Todo</h2>
<ul>
<li class="fragment">Multi-modality: video, audio, text, images</li>
<li class="fragment">Auto-RAG with privacy:
<ul>
<li class="fragment">Easy to do badly. Hard to do securely.</li>
</ul></li>
<li class="fragment">Everything from the previous slide</li>
</ul>
</section>
<section class="slide level1">
<h2 id="potato">Potato</h2>
<p><img data-src="images/IMG_6561.jpg" width="350" /></p>
</section>
<section id="take-the-slides-with-you" class="slide level1">
<h1>Take the slides with you</h1>
<figure>
<img data-src="images/talk-jsc-colloquium.png" width="500"
alt="https://go.fzj.de/2024-12-jsc-colloquium" />
<figcaption
aria-hidden="true">https://go.fzj.de/2024-12-jsc-colloquium</figcaption>
</figure>
</section>
<section class="slide level1">
<h2 id="questions">Questions?</h2>
<figure>
<img data-src="images/blablador-questions.jpg" width="500"
alt="No dogs have been harmed for this presentation" />
<figcaption aria-hidden="true">No dogs have been harmed for this
presentation</figcaption>
</figure>
</section>
<section class="slide level1">
...@@ -711,6 +826,13 @@ etc)</li>
</section>
<section class="slide level1">
<h2 id="llmops-resource">LLMOps resource</h2>
<p>A No-BS Database of How Companies Actually Deploy LLMs in Production:
300+ Technical Case Studies, Including Self-Hosted LLMs in <a
href="https://www.zenml.io/llmops-database">https://www.zenml.io/llmops-database</a></p>
</section>
<section class="slide level1">
<blockquote>
<p>“I think the complexity of Python package management holds down AI
application development more than is widely appreciated. AI faces
...