Archive for July 4th, 2024

Klinikum Darmstadt & Agaplesion Elisabethenstift Darmstadt – plan to merge into a structure that would be a first nationwide: a combination of a municipal and a church-run hospital

Thursday, July 4th, 2024

Proxmox Virtual Environment (VE) 8.2.4 – running VMs in RAM

Thursday, July 4th, 2024

Universitätsklinikum Aachen (UKA) – is more than just a hospital

Thursday, July 4th, 2024

Proxmox Virtual Environment (VE) 8.2.4 – how to use your first local 'Meta Llama 3' Large Language Model (LLM) project without the need for a GPU and now with Open WebUI and AnythingLLM

Thursday, July 4th, 2024

root@pve-ai-llm-01:~# docker container ls

CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
4a4fb2745a55 ghcr.io/open-webui/open-webui:main "bash start.sh" 6 days ago Up 9 minutes (healthy) open-webui
root@pve-ai-llm-01:~#
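Note: the open-webui container shows no published ports in the listing above, while the netstat output further down lists a python3 process on 0.0.0.0:8080 – so Open WebUI was presumably started with host networking in the earlier post. Two quick ways to double-check this on a running container:

# show the network mode docker recorded for the container (e.g. "host" or "default")
docker inspect -f '{{.HostConfig.NetworkMode}}' open-webui
# tail the container logs to see which port Open WebUI reports it is serving on
docker logs --tail 20 open-webui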
root@pve-ai-llm-01:~# docker run -d -p 3001:3001 --name AnythingLLM --restart always mintplexlabs/anythingllm
Unable to find image 'mintplexlabs/anythingllm:latest' locally
latest: Pulling from mintplexlabs/anythingllm
37aaf24cf781: Pull complete
4f4fb700ef54: Pull complete
f9b3a3c17e18: Pull complete
1921a8057676: Pull complete
81fdb4ddb4bd: Pull complete
232a668a11f5: Pull complete
f01b34815b00: Pull complete
43dac93afdc8: Pull complete
a2d98e6575fe: Pull complete
fd39a40ca0cc: Pull complete
7fceeae671a1: Pull complete
eb8184c79ec1: Pull complete
9dacbcaa61c2: Pull complete
5239da4508c8: Pull complete
367d2351b578: Pull complete
00c065c836ef: Pull complete
Digest: sha256:71dad99e531e76b52101a4626bb6b6e29dd43dac6809fd54d399c88d5b966bcd
Status: Downloaded newer image for mintplexlabs/anythingllm:latest
170b0a8be37d819a8db52b25fb237fe20fc2cab3d5944f732e3af42c5f13219d
root@pve-ai-llm-01:~#
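Before opening the web UI, it is worth confirming that the freshly pulled container actually came up – for example by following its logs or probing the published port (container name and port as used above):

# follow the startup logs of the new container (Ctrl+C stops following)
docker logs -f AnythingLLM
# or just probe the published port and look at the response headers
curl -sI http://localhost:3001 | head -n 5

Note that the docker run line above mounts no volume, so workspaces and settings live only inside the container; for persistence the AnythingLLM image can be started with a storage volume as described in the project's Docker documentation.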

root@pve-ai-llm-01:~# docker container ls
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
170b0a8be37d mintplexlabs/anythingllm "/bin/bash /usr/loca…" 11 seconds ago Up 5 seconds (healthy) 0.0.0.0:3001->3001/tcp, :::3001->3001/tcp AnythingLLM
4a4fb2745a55 ghcr.io/open-webui/open-webui:main "bash start.sh" 6 days ago Up 18 minutes (healthy) open-webui
root@pve-ai-llm-01:~#
root@pve-ai-llm-01:~# netstat -tulpn | grep LISTEN
tcp 0 0 0.0.0.0:8080 0.0.0.0:* LISTEN 721/python3
tcp 0 0 127.0.0.1:25 0.0.0.0:* LISTEN 500/master
tcp 0 0 127.0.0.54:53 0.0.0.0:* LISTEN 147/systemd-resolve
tcp 0 0 0.0.0.0:3001 0.0.0.0:* LISTEN 1571/docker-proxy
tcp 0 0 127.0.0.53:53 0.0.0.0:* LISTEN 147/systemd-resolve
tcp 0 0 127.0.0.1:11434 0.0.0.0:* LISTEN 343/ollama
tcp6 0 0 ::1:25 :::* LISTEN 500/master
tcp6 0 0 :::3001 :::* LISTEN 1586/docker-proxy
tcp6 0 0 :::22 :::* LISTEN 1/init
root@pve-ai-llm-01:~#
## Open AnythingLLM ##
http://pve-ai-llm-01:3001
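At this point AnythingLLM itself is reachable, but – judging from the netstat output above – Ollama still listens on 127.0.0.1:11434 only. The AnythingLLM container runs in its own network namespace and therefore cannot reach a loopback-only Ollama on the host, which is presumably why the ollama.service unit is adjusted next. The difference is easy to see (assuming the hostname resolves to the VM's LAN address, not to 127.0.0.1):

# answers, because Ollama is bound to the loopback interface
curl -s http://127.0.0.1:11434/
# refused as long as Ollama is not bound to 0.0.0.0
curl -s http://pve-ai-llm-01:11434/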
root@pve-ai-llm-01:~# vi /etc/systemd/system/ollama.service
[Unit]
Description=Ollama Service
After=network-online.target
[Service]
Environment="OLLAMA_HOST=0.0.0.0"
ExecStart=/usr/local/bin/ollama serve
User=ollama
Group=ollama
Restart=always
RestartSec=3
Environment="PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin"
[Install]
WantedBy=default.target
root@pve-ai-llm-01:~#
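A side note on the edit above: instead of changing /etc/systemd/system/ollama.service directly (an Ollama upgrade may overwrite it), the same variable can be set through a systemd drop-in; a restart of the service, as shown below, is still required afterwards:

# creates an override file under /etc/systemd/system/ollama.service.d/
systemctl edit ollama.service
# and in the editor that opens, add:
# [Service]
# Environment="OLLAMA_HOST=0.0.0.0"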
root@pve-ai-llm-01:~# systemctl daemon-reload
root@pve-ai-llm-01:~# systemctl restart ollama
root@pve-ai-llm-01:~#
root@pve-ai-llm-01:~# netstat -tulpn | grep LISTEN
tcp 0 0 0.0.0.0:3001 0.0.0.0:* LISTEN 647/docker-proxy
tcp 0 0 127.0.0.54:53 0.0.0.0:* LISTEN 170/systemd-resolve
tcp 0 0 0.0.0.0:8080 0.0.0.0:* LISTEN 729/python3
tcp 0 0 127.0.0.53:53 0.0.0.0:* LISTEN 170/systemd-resolve
tcp 0 0 127.0.0.1:25 0.0.0.0:* LISTEN 441/master
tcp6 0 0 :::3001 :::* LISTEN 654/docker-proxy
tcp6 0 0 :::22 :::* LISTEN 1/init
tcp6 0 0 :::11434 :::* LISTEN 4071/ollama
tcp6 0 0 ::1:25 :::* LISTEN 441/master
## Open AnythingLLM Desktop for Windows ##
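With Ollama now listening on all interfaces (note the :::11434 line in the netstat output above), the desktop app on a Windows machine only needs network access to port 11434 on pve-ai-llm-01; in AnythingLLM's LLM preference the Ollama provider can then be pointed at the base URL http://pve-ai-llm-01:11434 (field names may differ slightly between versions). A quick reachability test from the Windows command prompt (curl ships with current Windows 10/11):

# lists the models Ollama has pulled locally and confirms the port is reachable from Windows
curl -s http://pve-ai-llm-01:11434/api/tags

Keep in mind that the API is now reachable network-wide, so restricting access with a firewall rule is advisable.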

AnythingLLM – is a free, easy-to-use and versatile document chatbot. It is made for people who want to chat with, or build a custom knowledge base from, existing documents and websites. It does this with Retrieval-Augmented Generation (RAG), a process for creating custom knowledge bases that conveniently sidesteps the need for finetuning a Large Language Model (LLM); a minimal sketch of that idea follows below

Thursday, July 4th, 2024
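To illustrate the RAG idea from the entry above with the components installed here: the retrieved document snippets are simply injected into the prompt, so the base model itself is never retrained. A minimal, hypothetical sketch against the local Ollama API (the model name llama3 is an assumption and must already be pulled; AnythingLLM automates the retrieval and prompt assembly):

# hypothetical RAG-style request: the "Context:" text stands in for a chunk
# that AnythingLLM would normally fetch from its vector database
curl -s http://pve-ai-llm-01:11434/api/generate -d '{
  "model": "llama3",
  "stream": false,
  "prompt": "Answer using only the following context.\nContext: Ollama listens on port 11434 by default.\nQuestion: Which port does Ollama use?"
}'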

Krankenhaus Wittmund gGmbH – an extensive conversion and extension project worth more than € 30 million is coming up

Thursday, July 4th, 2024

The Unfallkrankenhaus in Berlin Marzahn (ukb) – linking work and study on the health campus

Thursday, July 4th, 2024