Recreating NASA's work, QuickAdd now supports local LLMs, and book notes on Go and Elon

We managed to recreate NASA's work on the ChemCam calibration models. QuickAdd now supports local LLMs and external providers like Groq. Book notes on Let's Go! and Elon Musk.

Hey there.

I hope you’ve been doing great.

Let me also welcome every new member! I’m thrilled to have you here.

In my last email, I wrote about how I was starting my final year at Uni, and how I’d be working at the intersection of science, space exploration, and artificial intelligence. It has been a great success so far.

We’re working with a member of ongoing NASA Mars missions to see if we can improve their models for analyzing rock samples on Mars. We spent the past semester recreating their current models, which you can see the code for here.

For the final semester, we’ll be working on creating more accurate and robust models using the latest advances in machine learning. I’m super excited by the prospect of contributing (if even just a little), and hope that our ideas turn out to work!

What I've made for you

I have a long queue of books to process, but here are my notes on two of the books I’ve read recently. 14 books left in the queue, so stay tuned for notes on more great books. You can also find me on Goodreads to see what I’m currently reading.

Let’s Go! by Alex Edwards

This is a terrific book for learning Go.

It's very practical and hands-on, while still providing a good amount of theory and background.

I bought this book because I wanted to learn Go, and perhaps pick up some best practices and fill in gaps about backend web development.

I think it did a great job at that.

Elon Musk by Walter Isaacson

Walter Isaacson is a great writer.

I enjoyed this book and noted some very useful takeaways.

QuickAdd Update: Local & external LLM providers

On Feb 11 I announced that QuickAdd for Obsidian would support local LLMs for its AI Assistant.

This is now the case, as of version 1.8! 🥳

You can now use models like Mistral and Llama in your Obsidian vault via QuickAdd’s AI Assistant. This means you don’t have to send your data to OpenAI — you decide if you want to send it to your own server, computer, or a trusted external provider.
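Under the hood, many local servers (such as Ollama) expose an OpenAI-compatible API, so the same chat-completion request shape works whether the data goes to OpenAI, your own machine, or an external provider. Here is a minimal sketch of that request shape; the endpoint URL and model name are illustrative assumptions, not QuickAdd settings:

```python
# Sketch of an OpenAI-compatible chat request that a local LLM server
# (e.g. Ollama, which serves an API at http://localhost:11434/v1) would
# accept. The URL and model name are illustrative, not QuickAdd config.
import json

base_url = "http://localhost:11434/v1"  # assumed local endpoint

payload = {
    "model": "mistral",  # illustrative local model name
    "messages": [
        {"role": "system", "content": "You are a helpful note-taking assistant."},
        {"role": "user", "content": "Summarize this note in one sentence."},
    ],
}

# A client would POST this payload to base_url + "/chat/completions";
# here we only show the request body.
print(json.dumps(payload, indent=2))
```

Because the request shape is the same everywhere, switching between OpenAI, a local model, and a provider like Groq is mostly a matter of changing the base URL and model name.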

In the tweet below, I show how I’m using Groq to explain an equation in 1.5 seconds. And in the original announcement, I demonstrated Mistral 7B running locally.

Embrace local-first: use your own computer as an intellectual sparring partner.

If you are already using QuickAdd and just want to get started, here’s a guide.

Quote

Complaining is not a strategy. You have to work with the world as you find it, not as you would have it be.

Jeff Bezos