• 0 Posts
  • 10 Comments
Joined 8 months ago
Cake day: March 15th, 2024

  • A few years ago now I was thinking that it was about time for me to upgrade my desktop (with a case that dates back to 2000 or so, I guess they call them “sleepers” these days?) because some of my usual computer things were taking too long.

    And I realized that Intel was selling the 12th generation of the Core at that point, which meant the next one was the 13th generation and, I dono, I’m not superstitious, but I figured if anything went wrong I’d feel pretty darn silly. So I pulled the trigger and got a 12th gen Core processor and motherboard and a few other bits.

    This is quite amusing in retrospect.




  • I mean, I think he’s a textbook example of why not to do drugs and why we need to eat the rich, but I can understand the logic here.

    When you navigate a car as a human, you are using vision, not LIDAR. Outside of a few edge cases, you aren’t even using parallax to judge distances. Ergo, LIDAR is not going to see the text on a sign, the reflective stripes on a truck, etc. And it gets confused differently than the eye does, gets absorbed differently because it operates at different wavelengths, etc. And you can jam LIDAR if you want. Thus, if we were content to wait until the self-driving car is actually safe before throwing it out into the world, we’d probably want the standard to be that it navigates as well as a human in all situations using only visual sensors.

    Except, there are some huge problems that the human visual cortex makes look real easy. Because “all situations” means “understanding that there’s a kid playing in the street from visual cues so I’m going to assume they are going to do something dumb” or “some guy put a warning sign on the road and it’s got really bad handwriting.”

    Thus, the real problem is that he’s not using LIDAR as harm reduction for a patently unsafe product, though the various failure modes of LIDAR-equipped self-driving cars show that those aren’t safe either.


  • It’s important to realize that the nerd you saw on the news has always been someone wearing nerd as a costume, and the entire history of technology is loaded with examples of the real nerd being marginalized. It’s just that in ages past the VCs would give a smaller amount of money and require the startup to go through concrete milestones to unlock all of it, so there was more of a chance for the founder’s dreams to smack up against reality before they were $230m in the hole with no product worth selling.


  • While there is arguably a larger pool of people you can reach by not having open racism and the CEO whipping his dick out (and mysteriously not slamming it into his Tesla door, even if it is a masterful gambit), you can still get a lot of white men of privilege who are smart and hardworking, who don’t nominally worry about being on the receiving end of most of the harassment, so it’s OK as long as they end up part of the winning team, because they’ll get mega stock bucks at the end. And this does extend to the factory floor, or at least to people’s impressions when joining the factory floor. They wouldn’t be an engineer, but they’ll be a supervisor or something?

    It’s kinda un-earned? Like, there are stories of questionable veracity that people tell each other? Some set of startups in the days of yore gave their cleaning staff or whatnot options, so I think it’s become part of the cultural mythos now, even if the reality is that the cleaning staff these days are contractors who are mistreated. So even if it did actually happen then, it won’t happen now.

    And, dono, once you’ve solved the hard problems early on, there’s less of that drive to do the truly novel things and so you get more of the people who want to be part of a company that’s going to the top and wouldn’t mind if they could coast and/or fail upwards along the way.

    The problem is that employers tend to presume that they can continue to abuse people going forward because they’ve gotten away with it so far. Until they do things like yank offers from new college grads or lay off too many of the professional staff, at which point they’ve shattered the illusion.

    tl;dr: Elon sowing: Haha fuck yeah!!! Yes!!

    Elon reaping: Well this fucking sucks. What the fuck.




  • As best I can tell, the touchscreen is added at the concept phase by folks who mostly know what’s going to make people look at the car and want to buy it, several years before the car hits the market and well before the actual car electronics teams are involved.

    So, yeah, car UI/UX sucks right now because we’re seeing all of the things added to cars a few years ago in response to Tesla, implemented by people who think that, just because they programmed a random car-focused microcontroller back in the day, they understand all of the layers involved in a modern Linux or Android or Windows embedded car electronics unit, including layer 8 of the OSI stack (meaning: interfacing with humans).

    But, yah, dono. I don’t actually have my own car. My spouse got a Mazda a bunch of years ago now, and it actually has a pretty good touchscreen interface paired with physical controls, such that if you want to dig into stuff you can use the touchscreen, but all of the common stuff is switches and knobs. The generation before that had way, way too many buttons and it was just gag-me-with-a-spoon. The generation after that removed the touchscreen because the leadership at Mazda decided people were just not to be trusted with a touchscreen, and I feel like they went a little too far in the wrong direction.

    Meanwhile, in airplane cockpit design, they go to great pains to make sure you can navigate by touch where necessary, such that all of the knobs are differently textured or shaped. And, as I said, I don’t actually have my own car, but I have to say that if I did have a car, I’d want it to be designed like that.