
AnythingLLM is installed on an Ubuntu server.
In the system LLM settings, it can connect to the Ollama server and fetch the models.
But when I chat in a workspace, the Docker container exits.
The browser shows this info:

… and the Docker logs show:
"/usr/local/bin/docker-entrypoint.sh: line 7: 115 Illegal instruction (core dumped) node /app/server/index.js"
What is the problem?
How can I determine if my CPU supports AVX instructions?
root@pve-ai-llm-02:~# cat /proc/cpuinfo | grep -i avx2
root@pve-ai-llm-02:~#
AVX2 is not supported!
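For reference, here is a small sketch for checking which of these SIMD flags the CPU exposes, assuming a Linux host where `/proc/cpuinfo` lists the CPU flags (the `check_flag` helper name is my own):

```shell
#!/bin/sh
# Report whether the CPU advertises AVX / AVX2 support on Linux.
# Reads /proc/cpuinfo; grep -w matches the whole flag name, so "avx"
# does not accidentally match "avx2" or "avx512f".
check_flag() {
  if grep -qw "$1" /proc/cpuinfo; then
    echo "$1: supported"
  else
    echo "$1: NOT supported"
  fi
}

check_flag avx
check_flag avx2
```

An empty result from the `grep -i avx2` above means the same thing: the flag is simply absent from the flags line, which is why the binary dies with SIGILL (illegal instruction) when it executes an AVX2 instruction.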
root@pve-ai-llm-02:~# docker ps
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
d605cf5074a5 mintplexlabs/anythingllm "/bin/bash /usr/loca…" 5 hours ago Up 19 seconds (healthy) 0.0.0.0:3001->3001/tcp, :::3001->3001/tcp AnythingLLM
root@pve-ai-llm-02:~# docker container logs d605cf5074a5
[collector] info: Collector hot directory and tmp storage wiped!
[collector] info: Document processor app listening on port 8888
Environment variables loaded from .env
Prisma schema loaded from prisma/schema.prisma
✔ Generated Prisma Client (v5.3.1) to ./node_modules/@prisma/client in 391ms
Start using Prisma Client in Node.js (See: https://pris.ly/d/client)
```
import { PrismaClient } from '@prisma/client'
const prisma = new PrismaClient()
```
or start using Prisma Client at the edge (See: https://pris.ly/d/accelerate)
```
import { PrismaClient } from '@prisma/client/edge'
const prisma = new PrismaClient()
```
See other ways of importing Prisma Client: http://pris.ly/d/importing-client
Environment variables loaded from .env
Prisma schema loaded from prisma/schema.prisma
Datasource "db": SQLite database "anythingllm.db" at "file:../storage/anythingllm.db"
21 migrations found in prisma/migrations
No pending migrations to apply.
┌─────────────────────────────────────────────────────────┐
│ Update available 5.3.1 -> 5.16.1 │
│ Run the following to update │
│ npm i --save-dev prisma@latest │
│ npm i @prisma/client@latest │
└─────────────────────────────────────────────────────────┘
[backend] info: [TELEMETRY ENABLED] Anonymous Telemetry enabled. Telemetry helps Mintplex Labs Inc improve AnythingLLM.
[backend] info: prisma:info
[backend] info: [TELEMETRY SENT]
[backend] info: [CommunicationKey] RSA key pair generated for signed payloads within AnythingLLM services.
[backend] info: [EncryptionManager] Self-assigning key & salt for encrypting arbitrary data.
[backend] info: Primary server in HTTP mode listening on port 3001
[backend] info: [BackgroundWorkerService] Feature is not enabled and will not be started.
[backend] error: undefined
[backend] error: undefined
[backend] error: undefined
[backend] error: undefined
[backend] info: [Event Logged] - update_llm_provider
[backend] info: [Event Logged] - update_embedding_engine
[backend] info: [Event Logged] - update_vector_db
[backend] info: [TELEMETRY SENT]
[backend] info: [Event Logged] - workspace_created
[backend] info: [TELEMETRY SENT]
[backend] info: [NativeEmbedder] Initialized
[backend] info: [NativeEmbedder] Initialized
/usr/local/bin/docker-entrypoint.sh: line 7: 107 Illegal instruction (core dumped) node /app/server/index.js
[collector] info: Collector hot directory and tmp storage wiped!
[collector] info: Document processor app listening on port 8888
Environment variables loaded from .env
Prisma schema loaded from prisma/schema.prisma
✔ Generated Prisma Client (v5.3.1) to ./node_modules/@prisma/client in 386ms
Start using Prisma Client in Node.js (See: https://pris.ly/d/client)
```
import { PrismaClient } from '@prisma/client'
const prisma = new PrismaClient()
```
or start using Prisma Client at the edge (See: https://pris.ly/d/accelerate)
```
import { PrismaClient } from '@prisma/client/edge'
const prisma = new PrismaClient()
```
See other ways of importing Prisma Client: http://pris.ly/d/importing-client
Environment variables loaded from .env
Prisma schema loaded from prisma/schema.prisma
Datasource "db": SQLite database "anythingllm.db" at "file:../storage/anythingllm.db"
21 migrations found in prisma/migrations
No pending migrations to apply.
[backend] info: [TELEMETRY ENABLED] Anonymous Telemetry enabled. Telemetry helps Mintplex Labs Inc improve AnythingLLM.
[backend] info: prisma:info
[backend] info: [TELEMETRY SENT]
[backend] info: [CommunicationKey] RSA key pair generated for signed payloads within AnythingLLM services.
[backend] info: [EncryptionManager] Loaded existing key & salt for encrypting arbitrary data.
[backend] info: Primary server in HTTP mode listening on port 3001
[backend] info: [BackgroundWorkerService] Feature is not enabled and will not be started.
root@pve-ai-llm-02:~#
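Side note: the hostname (pve-ai-llm-02) suggests this is a Proxmox VE guest. If so, one common cause is the VM's default CPU type (e.g. kvm64), which masks AVX/AVX2 from the guest even when the physical host CPU supports them. Assuming the host CPU itself has AVX2, one possible remedy is passing the host CPU model through to the VM — a sketch, not a verified fix for this setup, with `<vmid>` standing in for the VM's actual ID:

```
# On the Proxmox host (hypothetical VM ID):
qm set <vmid> --cpu host       # expose the host CPU's flag set to the guest
qm shutdown <vmid>
qm start <vmid>                # CPU type changes need a full stop/start

# Afterwards, inside the guest, the flag should appear:
grep -i avx2 /proc/cpuinfo
```

If the physical host CPU genuinely lacks AVX2, no VM setting will help; the container would need to run on different hardware.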