June 2023 AI-related Reading List
I just found myself with a few dozen tabs open with AI-related resources that I wanted to read. Having them in RAM (i.e. open in my browser) was starting to overwhelm me, as I felt I could never find the time to read them (leading to a really uncomfortable feeling while I had to make progress in my day-to-day job). Hence, I decided to periodically move my reading lists from RAM to “persistent storage” so that I can pick them up once I have some time for them (and in the process share them with others who may be interested).
The check-box at the beginning of each item signals whether I have read and processed it yet.
Learning Resources
- Transformer Architecture: The positional encoding (see the sketch after this list)
- Hugging Face’s (HF) Transformers Getting Started
- HF’s Diffusers Resources
- What is a vector database?
- Making LLMs even more accessible with bitsandbytes, 4-bit quantization and QLoRA
- A Gentle Introduction to 8-bit Matrix Multiplication for transformers at scale using transformers, accelerate and bitsandbytes
- LLM Powered Autonomous Agents
- Understanding DeepMind’s Sorting Algorithm
- The Secret Sauce behind 100K context window in LLMs: all tricks in one place
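To ground the first item a bit, here is a minimal NumPy sketch of the sinusoidal positional encoding that article walks through. This is my own illustration of the standard “Attention Is All You Need” formulation, not code from the article, and it assumes an even embedding dimension:

```python
import numpy as np

def sinusoidal_positional_encoding(seq_len: int, d_model: int) -> np.ndarray:
    """Classic sinusoidal positional encoding (assumes d_model is even).

    Returns an array of shape (seq_len, d_model) where even dimensions use
    sine and odd dimensions use cosine, each pair at a different frequency.
    """
    positions = np.arange(seq_len)[:, np.newaxis]         # (seq_len, 1)
    dims = np.arange(0, d_model, 2)[np.newaxis, :]        # (1, d_model/2)
    angle_rates = 1.0 / np.power(10000, dims / d_model)   # one frequency per dim pair
    angles = positions * angle_rates                       # (seq_len, d_model/2)

    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles)   # even indices: sine
    pe[:, 1::2] = np.cos(angles)   # odd indices: cosine
    return pe

# Example: encodings for a 4-token sequence with an 8-dimensional model.
print(sinusoidal_positional_encoding(4, 8).round(3))
```

The key property is that each dimension pair oscillates at a different frequency, so every position gets a unique fingerprint that the model can use to reason about relative distances.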
Projects
- Karpathy’s minGPT implementation
- PrivateGPT
- QLoRA: Efficient Finetuning of Quantized LLMs (see the sketch after this list)
- Guanaco 7B Colab example
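Tying the QLoRA and bitsandbytes entries together, here is a rough sketch of what loading a model in 4-bit and attaching LoRA adapters looks like with transformers, peft, and bitsandbytes. The model id is just a placeholder and the target module names assume a LLaMA-style architecture; treat it as an illustration, not the projects’ own code:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training

model_id = "huggyllama/llama-7b"  # placeholder; any causal LM on the Hub works

# 4-bit NF4 quantization config in the spirit of the QLoRA paper.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_use_double_quant=True,
    bnb_4bit_compute_dtype=torch.bfloat16,
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, quantization_config=bnb_config, device_map="auto"
)

# Prepare the quantized model and attach small trainable LoRA adapters.
model = prepare_model_for_kbit_training(model)
lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],  # assumption: LLaMA-style attention layers
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # only the LoRA weights are trainable
```

The appeal of this setup is that the frozen base weights stay in 4-bit NF4 while only the small LoRA matrices are trained in higher precision, which is what makes fine-tuning 7B+ models feasible on a single consumer GPU.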
Papers
Any comments, contributions, or feedback? Ping me!