My wife and I go on kicks of playing a bunch of Minecraft together, which is amazing
As others have said, loss of interest can happen, and the interest can of course come back with a vengeance. I’d recommend picking up another hobby until gaming suddenly grabs your interest again.
Two types of hobbies that have lasting positive impacts on people are creative hobbies and physical hobbies. Your brain is wired to invent and create and your body is wired to move, so being able to do each for fun is brilliant for your mental and physical health. Hop on a bicycle, go for a walk and enjoy the crisp fall air, stop off at that gym you forgot to cancel your membership for, and start doing it regularly.
For creative hobbies you can get a pack of printer paper for a couple of bucks and a pack of Crayola crayons or colored pencils and just start doodling. If you suck at drawing, make weird geometric shapes to rebuild the fine motor skills that computers have killed. Or if you want something more in-depth, model making is always great because it has elements of fantasy while having entry points at any skill level. Personally I’ve been getting back into model railroading, and if watching a train go around in circles seems boring, consider that it has its own tabletop roleplay scene in the form of operations
At least as far as US law is concerned, a federally hosted and administered social media platform gets interesting with America’s unusually strong free speech protections. There’s content which is legal but unethical, such as bullying, hate speech, and misinformation, which they likely would not be allowed to block or moderate. On the other hand, illegal content would be immediately moderated away, which might include content that falls into legal grey areas or content that’s ethical but technically illegal, like someone copy/pasting the contents of a paywalled article, or discussing any kind of DRM or digital security bypass
Honestly I think there’s good reason for governments to host a Mastodon instance for their representatives to use for communications, but inviting the public to use it might get weird for sure
The upscaling technologies they’ve been building into modern graphics stacks also have benefits for much older games where the extra performance isn’t necessarily needed. There’s an old game I like to play, Railroad Tycoon 2, which doesn’t run at resolutions higher than 1024x768, and modern upscaling can make that game look absolutely gorgeous despite rendering 4+ pixels per original pixel. I’m sure it provides similar benefits to emulators and the like too!
Thunderbird still supports RSS; however, I’ve found many news sources don’t provide proper RSS feeds anymore
I just accepted a job with a small MSP starting early next year. I kept a close ear out during the interview for signs of the classic MSP hell stuff that chews through techs, and it does look like I got a good one (a small shop of 8 or so people), but check in with me in about 3 months and we’ll see how I’m feeling haha
My longer term plan is to use this as a stepping stone to move into an in-house role, then figure out my exit strategy before burnout takes me. I’m thinking I’ll either aim for IT management or possibly a business analytics or cloud administration type role. Technical sales probably wouldn’t be too bad either.
Especially with how normal memory tiering is nowadays, particularly in the datacenter (Intel’s bread and butter), now that you can stick a box of memory on a CXL network and fill it with the memory from the last-gen servers you just retired, giving you a third or fourth tier of memory before swapping. And that’s before the fun non-tiered memory stuff CXL enables. Really, CXL enables so much cool stuff that it’s going to be incredible once it starts hitting small single-row datacenters
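For a rough picture of what that tiering looks like from software, here’s a minimal sketch assuming the CXL-attached expander shows up to Linux as a separate, CPU-less NUMA node (node 1 here is an assumption, as is the split of “hot” vs “cold” data); it just places latency-sensitive data in local DRAM and bulk data on the far node via libnuma:

```c
// Minimal sketch: treating a CXL memory expander as a "far" memory tier.
// Assumes the expander appears as NUMA node 1 (hypothetical topology).
// Build with: gcc tier.c -lnuma
#include <stdio.h>
#include <numa.h>

#define HOT_BYTES  (64UL * 1024 * 1024)   /* frequently touched data */
#define COLD_BYTES (512UL * 1024 * 1024)  /* rarely touched data */

int main(void) {
    if (numa_available() < 0) {
        fprintf(stderr, "libnuma not available on this system\n");
        return 1;
    }

    int local_node = 0;  /* assumption: node 0 = CPU-attached DRAM */
    int cxl_node   = 1;  /* assumption: node 1 = CXL expander, no CPUs */

    /* Hot tier: keep latency-sensitive data in local DRAM. */
    void *hot = numa_alloc_onnode(HOT_BYTES, local_node);

    /* Cold tier: park bulk data on the slower CXL-attached memory. */
    void *cold = numa_alloc_onnode(COLD_BYTES, cxl_node);

    if (!hot || !cold) {
        fprintf(stderr, "allocation failed\n");
        return 1;
    }

    printf("hot tier on node %d, cold tier on node %d\n", local_node, cxl_node);

    numa_free(hot, HOT_BYTES);
    numa_free(cold, COLD_BYTES);
    return 0;
}
```

In practice recent Linux kernels can also demote cold pages to a node like that automatically, so applications don’t necessarily have to manage the tiers by hand.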
The main reason Intel can’t compete is that CUDA is both proprietary and the industry standard
Funnily enough this is actually changing because of the AI boom. Would-be buyers can’t get Nvidia AI cards, so they’re buying AMD and Intel and reworking their stacks as needed. It helps that there are also translation layers available now which translate CUDA and other otherwise vendor-specific stuff to the open protocols supported by Intel and AMD
He’s not wrong that GPUs in the desktop space are going away, because SoCs are inevitably going to be the future. This isn’t because the market has demanded it or some sort of conspiracy; it’s simply that we can’t get faster without chips getting smaller and closer together.
While I agree with you on a technical level, I read it as Pat Gelsinger intends to stop development of discrete graphics cards after Battlemage, which is disappointing but not surprising. Intel’s GPUs, while incredibly impressive, face an uphill battle with desktop users, and particularly gamers, in ensuring that every game a user wants to run generally works without compatibility problems.
Ideally Intel would keep their GPU department going, because they have a fighting chance at holding a significant market share now that they’re past the hardest hurdles, but they’re in a hard spot financially, so I won’t be surprised if they’re forced to divest from discrete GPUs entirely
Seriously, putting a couple gigs of on-package graphics memory would completely change the game, especially if it does some intelligent caching and spills over to system RAM for additional memory as needed.
I want to see what happens if Intel or AMD seriously let a generation rip with on-package graphics memory for the iGPU. The only real drawback I could see is if the power/thermal budget just isn’t sufficient and it ends up with wonky performance (which I have seen on an overly thin-and-light laptop in my personal fleet: it’s got a Ryzen 2600, if memory serves, that’s so thermally limited it leaves a ton of performance on the table)
To be fair, the ARM SoCs on phones use big.LITTLE core layouts, where the OS enables/disables cores on the fly and moves software around so it’s either running on the big high-performance cores or the little low-power cores, based on the power budget needed at that second. So effectively not all of those 6+ cores would be available and in use at the same time on phones
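If you’re curious which cores the scheduler treats as big vs. little, a rough sketch like the one below just reads each core’s relative capacity from sysfs (Linux-specific; the cpu_capacity file is typically exposed on ARM systems but isn’t guaranteed everywhere, so treat this as an assumption):

```c
// Minimal sketch: list each CPU's relative capacity on an ARM Linux system.
// Big cores report a higher cpu_capacity than little cores; the sysfs file
// may be absent on non-ARM machines, so missing files just end the scan.
#include <stdio.h>

int main(void) {
    for (int cpu = 0; ; cpu++) {
        char path[128];
        snprintf(path, sizeof(path),
                 "/sys/devices/system/cpu/cpu%d/cpu_capacity", cpu);

        FILE *f = fopen(path, "r");
        if (!f) {
            if (cpu == 0)
                fprintf(stderr, "cpu_capacity not exposed on this system\n");
            break;  /* stop at the first core that doesn't exist or isn't exposed */
        }

        int capacity = 0;
        if (fscanf(f, "%d", &capacity) == 1)
            printf("cpu%d: capacity %d\n", cpu, capacity);
        fclose(f);
    }
    return 0;
}
```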
I have to disagree. When I tried out a VR headset at a con I spent 2 hours with the headset on in Space Pirate Training Simulator thinking it had only been 20 minutes. This was on the $250 Meta Quest 2, while I had a heavy backpack on my back because I didn’t have anyone with me to leave my bag with. I was trying to be conscious of not taking too much time with the headset so others could have a chance, and figured about 15-20 minutes would be appropriate, but apparently I was completely in the zone!
I can count on one hand how many times I’ve had that much of a time traveling game experience, so I’d say VR is a pretty dang cool experience and once hardware costs come down (or headsets become more ubiquitous) it’ll probably be a pretty big market for gamers, much like how consoles are now
They have a slim chance, if they keep subsidizing VR headsets, to hold a lucrative chunk of the VR market when that actually takes off. VR is genuinely cool enough that plenty of people will get hooked once they put a headset on their face and find a VR experience that clicks with them
New business idea!
You raise a good point
Honestly, for me it’s muscle memory from the Windows 95 days of “it is now safe to turn off your computer,” but I also don’t trust the OS to correctly interpret the ACPI signal sent by the power button 100% of the time. Obviously I’m not an average user, but I could see where an average user might consistently single-press the power button to turn off a computer
Well like a lover you must reach behind and underneath to turn them on!
…I seriously do not like Apple’s design language that basically requires me to fondle unseeable parts of the computer to find the power button. Too much risk of spiders back there!
Dare to be Stupid?
I need to give Matrix another try
In recent months YouTube has been suggesting content that screams “meninist/right wing onboarding” so if I just watched whatever it recommended I might be in a very different place right now…
I feel similarly. I work in an office that’s heavily invested in Microsoft for everything, and when you use Microsoft for everything, Teams fits in really nicely with great Outlook integration, Microsoft Loop integration, etc., and the experience on Teams is fine