Last Notes
https://haven.dergigi.com/8bed90fd13aec991c5d15922cb847c4292f5c0dd4f0fa2232c53db6e7925a3c9.gif
I'm not sure I understand the question?
yo @nprofile…emrk I've had this idea stuck in the back of my head for a long while now, namely doing short video clips of certain @nprofile…fhtd segments. Generated video obviously, bonus points if it resembles the art style of Waking Life (2001). In the best case everything is fully automated, or someone like you does it and I pay sats. plz halp
(which is a form of pride, btw)
Arrogance. It fails because of arrogance.
#nevent1q…njra
i want to repost this harder
#nevent1q…tgcx
High fever & explosive diarrhea
GM https://haven.dergigi.com/8966516f28cfc98b7e7b21d0358a2c647597715050ff93da93f3e03e59996dca.jpg
https://haven.dergigi.com/5d46920be7877d761b267773553275507d84f4a7d88173ccb64fb7a6c75a7865.gif
In other words: use CodeRabbit?
You are indeed! https://haven.dergigi.com/7a88a93ba8f2cf6d77b10e470ea4b16e7230c374062ee5f7cf7a908483c2d0dc.gif
it's a shame that most people don't know this https://haven.dergigi.com/be7bde1a933cfcd2a70ca26503fd0b815e51862e6546a00fe9117dbbb77b4318.mp4
https://haven.dergigi.com/f1af86dd2d5bb8d0cdbdf08ff8096a0776ef794d05737ef4a663e571a7dc2700.gif
I decided that people don't read. But in all seriousness I'll probably sleep on it a couple of times. Ask me again in 2 weeks (tm)
not asking for node running, but thx
You're not a developer.
You're not a vibecoder.
You're not a PM.
You're a sloperator.
appreciate all the responses 🫂
#nevent1q…ua8n
GM https://haven.dergigi.com/0fa496778c534f1ae91200a93d1ee1dd08dd88f1ef1d98bbd30ecf7320852d7d.jpg
Yes, many ways to counteract this.
https://media.tenor.com/t0JvaQCobUgAAAAM/nic-cage-nicolas-cage.gif
https://media.tenor.com/iMF4Z8Stb8cAAAAM/nicholas-cage.gif
Thanks for the link though, good to see some experience reports.
"sloppedy slop slop but make it all lowercase"
Might be worth waiting for the M5?
Yes, Framework Max+ 395 (128GB) is definitely an option.
I have multiple NAS and plenty of disk space. What I'm talking about is running LLMs locally.
What do you have currently?
All I want to do is run models. Nothing else.
I've got my eye on the ThinkStation PGX (GB10 / 128GB / 1TB). Should be able to run some of the more capable models quite well.
I've had a Start9 for a long time if that's what you mean...
considering buying hardware to run everything locally. What should I buy? #asknostr