Advanced Search

Search Results

248 total results found

Vorherige Versionen Von Texten

Personal System

Politik vor 2026-04-08

Personal System Vorherige Versionen Von Texten

Politik One often hears that one should not judge people by their political views. But politics is nothing abstract; it is personal. It decides how we live, how free we are, and what kind of society we grow up in. F...

LLAMA.cpp RPC Test

AI & LLM

Notes and some documentation about my tests with llama.cpp RPC, using multiple computers with multiple GPUs for larger inference.

llama.cpp RPC Multi-GPU Setup Guide (Windows 11 & Linux Distributed Cluster)

AI & LLM LLAMA.cpp RPC Test

This was generated by NotebookLM from a lot of sources. Question: I have three computers at home: one AMD Ryzen 7800X3D with 32GB RAM, an NVIDIA RTX 5070 with 12GB VRAM, and Windows 11; two machines with an Intel i5 11th gen, 16GB RAM and NVIDIA RTX 2070-tier 8GB VRAM...

Ranger Seelenwandler - Zustandsschaden (Soulbeast CondiBuild)

Games and Virtual Worlds Guild Wars 2

For the Ranger with Soulbeast using condition damage

Some Output

AI & LLM LLAMA.cpp RPC Test

Gemma 4 26B A4B - Server llama-server.exe -hf ggml-org/gemma-4-26B-A4B-it-GGUF:Q4_K_M -ngl 99 --rpc 192.168.0.91:21000,192.168.0.92:21000 -c 4096 Input: Hello Gemma. How are you doing? How much do you know? Thinking "Hello Gemma. How are you doing? How much ...
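The snippet above crams the whole server invocation into one line. As a sketch, the distributed launch could look like this; the `llama-server.exe` flags and RPC endpoints are taken verbatim from the snippet, while the `rpc-server` command on the worker machines is an assumption based on llama.cpp's RPC tooling and should be checked against your build:

```shell
# Worker machines (192.168.0.91 and 192.168.0.92): expose their GPU over RPC.
# Assumed invocation; verify the flags against your llama.cpp build.
rpc-server -H 0.0.0.0 -p 21000

# Main machine (Windows 11): command from the snippet, wrapped for readability.
llama-server.exe \
  -hf ggml-org/gemma-4-26B-A4B-it-GGUF:Q4_K_M \
  -ngl 99 \
  --rpc 192.168.0.91:21000,192.168.0.92:21000 \
  -c 4096
```

The `--rpc` list tells the server which remote backends to offload layers to; `-ngl 99` offloads all layers to GPU backends, local and remote.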

LLAMA.cpp and RAG Resources To Read

AI & LLM

LLAMA.cpp https://retr0.blog/blog/llama-rpc-rce https://tekkix.com/articles/ai/2024/09/distributed-inference-llamacpp-via-rpc https://github.com/ggml-org/llama.cpp/blob/master/docs/docker.md https://huggingface.co/google/gemma-4-31B-it https://onyx.app/self-h...

Local Install and execution of LLMs with vLLM

AI & LLM

Make sure your CUDA and NVIDIA drivers are installed: nvcc --version, nvidia-smi. Update the system and install uv from https://astral.sh/ following the instructions on the GitHub site https://github.com/astral-sh/uv: sudo apt update && sudo apt upgrade -y sudo apt insta...
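The snippet is truncated, so the exact apt packages are unknown; a minimal sketch of the checks and the uv install, using the installer documented on astral.sh, could look like this:

```shell
# Confirm the CUDA toolkit and NVIDIA driver are visible.
nvcc --version
nvidia-smi

# Bring the system up to date.
sudo apt update && sudo apt upgrade -y

# Install uv via its documented standalone installer (see https://github.com/astral-sh/uv).
curl -LsSf https://astral.sh/uv/install.sh | sh
```

With uv installed, vLLM can then be set up in an isolated environment rather than into the system Python.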

Prompt Library

AI & LLM

A collection of prompts to be used in OpenWebUI or other chatbots.

Context Extraction

AI & LLM Prompt Library

## Provide thread context as CSV You must now assume the role of a memory module in this system. Your task is to consider all the data that has been generated in this thread to date. From that data, you must isolate only the information that you have learne...
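The prompt above asks the model to emit the thread's accumulated context as CSV. A minimal sketch of consuming such output downstream; the column names here are hypothetical, since the snippet does not specify them:

```python
import csv
import io

# Hypothetical CSV emitted by the memory-module prompt above;
# the real columns depend on how the prompt is worded.
raw = "key,value\nuser_goal,set up llama.cpp RPC\nos,Windows 11\n"

# Parse the model output into a plain dict for reuse in a later prompt.
context = {row["key"]: row["value"] for row in csv.DictReader(io.StringIO(raw))}
print(context["user_goal"])  # -> set up llama.cpp RPC
```

Keeping the extracted context as key/value pairs makes it easy to re-inject into a fresh conversation as a system message.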

Summary Generators

AI & LLM Prompt Library

## Consolidate Outputs In Conversation Produce a new output which consolidates all of the outputs which you have provided in this conversation up to this point. However, do not repeat information. Preface this consolidated ...

Document Generation

AI & LLM Prompt Library

Generate Tech Documentation This is a simple but effective command which I use after a successful debugging session, when I want to use the context developed in the conversation to generate a document that I can refer to if I get stuck again in t...

Ultimate Angles - Winkel - Angles Collections

3D Printing 3D Printing Projects

Ultimate brackets collection https://makerworld.com/en/models/161705-ultimate-brackets-collection#profileId-182405

Update April 26

Über dieses Wiki Updates

Updated to 26.03.3. Nothing unusual to report. Everything still seems to work. Snapshot still active.