![](https://programming.dev/pictrs/image/b81990bc-be00-4002-bfbc-0ea56c57a554.png)
![](https://lemmy.world/pictrs/image/0d5e3a0e-e79d-4062-a7bc-ccc1e7baacf1.png)
Hey now! This design met all the PM’s requirements 😤
Just use Gentoo. Do it from scratch on the command line without the GUI installer like a pro 👍
At the very least you’ll learn how everything works at a deeper level.
You say that because you don’t realize the benefits:
There are actually a lot more reasons but that’s probably enough for now 😁
I’d love to see more adoption of… I2C!
Bazillions of motherboards and SBCs support I2C and many have the ability to use it via GPIO pins or even have connectors just for I2C devices (e.g. QWIIC). Yet there’s very little in the way of things you can buy and plug in. It feels like such a waste!
There’s all sorts of neat and useful things we could plug in and make use of if only there were software to use it. For example, cheap color sensors, nifty gesture sensors, time-of-flight sensors, light sensors, and more.
There’s lm-sensors, which knows I2C and can magically understand zillions of temperature sensors and PWM things (e.g. fan control). We need something like that for all those cool devices and chips that speak I2C.
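Userspace I2C on Linux really is that approachable. Here’s a hypothetical sketch reading a TMP102-style temperature sensor via the smbus2 Python library; the bus number and the 0x48 address are assumptions for illustration (check `i2cdetect` for your actual board):

```python
def tmp102_celsius(msb, lsb):
    """Convert a raw 12-bit TMP102 reading (the two bytes of its
    temperature register) to degrees Celsius; 0.0625 °C per count."""
    raw = (msb << 4) | (lsb >> 4)
    if raw & 0x800:          # sign-extend negative temperatures
        raw -= 0x1000
    return raw * 0.0625

def read_temp(bus_num=1, addr=0x48):
    """Read the temperature register over I2C (needs real hardware)."""
    from smbus2 import SMBus          # pip install smbus2
    with SMBus(bus_num) as bus:       # e.g. /dev/i2c-1
        msb, lsb = bus.read_i2c_block_data(addr, 0x00, 2)
    return tmp102_celsius(msb, lsb)

# Pure-math check, no hardware needed: raw bytes 0x19, 0x00 → 25.0 °C
print(tmp102_celsius(0x19, 0x00))
```

That’s the whole stack: open the bus device, read two bytes from a register, do a tiny conversion. The hard part isn’t the code, it’s that almost nothing ships drivers or userland tools for these chips.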
If you’re putting in a dishwasher just drill the holes. Your landlord will thank you for saving them the trouble of having to do that themselves some day.
I don’t think any normal landlord would give two shits about some dishwasher-hose-sized holes drilled under a sink, between internal cabinet walls, that no one will ever see. Such holes are so far back and out of the way… No one would ever notice unless they’re missing.
Wait: Do the times listed on the screen of your washer/dryer actually reflect reality‽
My dryer will say it’s got 20 minutes remaining for like an hour and a half. And yes, I clean the lint screen and vent regularly (all the way up to the roof!).
They’ll help you develop and test your AI stuff on Linux but not Windows (I don’t think… Completely different team of engineers).
I’m wondering what will happen when loads of games have built-in generative AI… Will these two paths cross and finally give us Linux folks Nvidia (graphics) drivers that are actually good? 🤔
This is caused by your root controller’s limited bandwidth and its inability to handle that many 3.0 devices at the same time. Some of the newer motherboards with USB C PD have controllers in them that can do a lot more.
It’s basically a hack on the part of the company that made the root controller IC. They know they only have enough internal bandwidth to support 16 USB 3.0 devices so they intentionally bork things when you plug in more than that, since their Transaction Translator (TT) can’t handle more and they were too lazy to bother implementing the ability to share 2.0 and 3.0 properly.
I’m guessing the decision went something like this…
“We have enough bandwidth for 16 3.0 devices… What do we do if someone plugs in more than that?” “Only a few people will ever have that many! We don’t have the budget to handle every tiny little use case! Just ship it.”
So it’s not Linux’s fault in this case. Or at least, if it is (a problem with the driver) it’s because of some proprietary bullshit that the driver requires to function properly 🤷
^this. Even a $200 cheap ass printer can handle printing a knob like this and it’ll even keep your toast warm forever if you want 👍
It’ll also provide endless hours of fucking around learning a new hobby and limitless conversation options to get people to leave you alone at parties!
Assuming you’re in the northern hemisphere: For this winter it’s fine. It’ll gently heat your home while you game like it’s 1999. No worries 😁
However, once it starts to warm up you’ll want to send that motherboard+RAM+CPU to your local HAZMAT trash pickup/facility and get something newer. Might I suggest a nice 2020-ish desktop CPU? With a motherboard that supports Coreboot, of course!
https://doc.coreboot.org/mainboard/index.html
…and get yourself a nice Nvidia (sadly, because AMD and Intel are still far behind) GPU with at least 12GB of VRAM so you can have fun with the open source AI stuff (it’s a blast!). The more VRAM the better though so if you can pick up a 4060 Ti with 16GB cheap this spring that’ll be your best budget buy (endless uncensored fun) 👍
Seriously: If you haven’t got the hardware to run Stable Diffusion locally you’re missing out! It’s as fun and addicting as a really good game. Running it on some cloud service isn’t the same because at best they’ll be running stuff that’s weeks or months out of date (which is like a million years in AI time) and they don’t give you the same level of control/possibilities that you get when running your own stuff locally (run whatever models/LoRAs you want, whatever extensions you want, generating images without having to worry about overbearing censorship because it is that bad on public AI services–paid or not!).
No, don’t rip! Gently remove the bad RAM from the PC.
It’s because they cheaped out and used (cheap) electromechanical switches for the buttons and electromechanical rotary encoders for the knobs.
If they used magnetic hall effect switches they’d never glitch (unless the microcontroller itself is glitching). Hall effect switches are forever.
(And no: Even cars in Arizona don’t get hot enough to wreck rare earth magnets… They’ll lose strength slightly above 80°C but not enough to matter since the car knows its internal temp and can compensate if they didn’t get the better sensors that auto-compensate).
For reference, hall effect switches and encoders aren’t really that much more expensive for something like a car where you’re going to be using/making millions of them. It probably saves pennies per car to use the cheap switches.
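The glitching comes from contact bounce: a mechanical switch chatters between open and closed for a few milliseconds on every press, so firmware has to paper over it with a debouncer. A rough sketch of the usual counter-based approach (the stability threshold is made up; real firmware tunes it to the switch):

```python
STABLE_SAMPLES = 4  # hypothetical threshold: how long input must hold steady

def debounce(samples, stable=STABLE_SAMPLES):
    """Return the debounced output for each raw input sample.
    The output only changes after the input disagrees with the
    current state for `stable` consecutive samples."""
    out = []
    state = samples[0] if samples else 0
    count = 0
    for s in samples:
        if s == state:
            count = 0            # input agrees with current state
        else:
            count += 1           # input differs; count how long
            if count >= stable:  # held long enough: accept the change
                state = s
                count = 0
        out.append(state)
    return out

# A bouncy press: chatter (1,0,1,0,1) then solid 1s.
# The debounced output ignores the chatter and flips exactly once.
print(debounce([0, 1, 0, 1, 0, 1, 1, 1, 1, 1, 1]))
```

A Hall effect switch has no contacts to bounce, which is why it never needs this dance (and why it outlives the rest of the dashboard).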
Prediction: Murdoch will be dead by then. He’s 92.
Edit: I think we’ll see news that he’s dead by next Saturday. Why? Trying to cash in my hopium 👍
What if I make my own videos whose actors and/or voices are entirely imaginary? I don’t have the resources to hire a videographer let alone an actor but I can write a script and use AI (and program/script things).
If I make something cool it’d be sad if no one watched it just because it didn’t use real human actors and voices.
I read the article! It suggests in a hundred different ways that Windows 11 sucks and that sticking it out with Windows 10 is a bad idea for a dozen different reasons.
The people here suggesting Linux nailed it. If you’re not using Linux at this point you’re just being lazy, IMHO. If you have any issues you can always just troubleshoot and fix it but based on the anecdotes posted so far it’s obvious no one claiming to have tried Linux has done much of that.
Get off your ass and learn something new for real or stop bitching and bend over for Microsoft with your wallet ready to pay them afterwards for the privilege.
People bitching about Windows on their personal PCs is like people who don’t vote bitching about politics.
I hear even the cows run Linux. They only peer into Windows from time to time.
As far as I know, black magic 🤷
I want more cores and more importantly, more ADC pins. Also, being able to use more PIOs simultaneously would be fantastic.
Furthermore I want one of those matrix modules aka “AI accelerators” to play with 😁
Great! Now we just need an announcement about the successor to the RP2040…
To be fair, it’s as “new” as what the major record labels put out!