They went through a phase where they were getting awful yields. No idea why they had those problems and others didn’t.
The money was to build fabs, which they are still doing. That is eating most of their cash, and they can barely afford it because their current chips are awful.
Their fabs may be shit too. Hopefully the new ones are better
Have any of the tech media done any work on which generations get improvements from this? Zen 4&5 sure, but what about earlier chips?
There aren’t enough AI specialists. More are being created by picking up these projects.
The problem is that AI is too hyped and people are trying to solve things it probably can’t solve. The projects I have seen work are basically fancy data ingress/parsing/summarisation apps. That’s where the current AI tech can really shine.
If you have the ability to build an AI app in-house - holy shit, that can improve productivity. Copilot itself for office use… Meh so far.
To actually answer your question: yes, but the only time I actually find it useful is for tests. For everything else it’s usually iffy and takes longer.
Intelligently loading the context window could be the next useful trick.
I think that giving the LLM an API to access additional context and then making it more of an agent style process will give the most improvement.
Let it request the interface for the class you’re using, let it request the code for that extension method you call. I think that would solve a lot, but I still see a LOT of instances where it calls wrong class/method names randomly.
This would also require a lot more in-depth (and language specific!) IDE integration though, so I foresee a lot of price hikes for IDEs in the near future!
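Something like this is what I’m picturing (purely a sketch: the tool protocol, the stubbed model and the IOrderRepository example are all made up). The orchestrator lets the model ask for a class’s interface before it commits to calling anything, instead of guessing member names.

```python
# Hypothetical agent loop: the "model" can request symbols from an IDE-backed
# index instead of hallucinating them. All names here are illustrative.
from typing import Optional


def fake_model(conversation: str) -> str:
    """Stand-in for the LLM. Asks for context once, then 'writes' code."""
    if "interface IOrderRepository" not in conversation:
        return "TOOL:get_interface:IOrderRepository"
    return "// generated code that now uses the real interface members"


def parse_tool_request(reply: str) -> Optional[tuple[str, str]]:
    """Detect a tool request of the form TOOL:<name>:<argument>."""
    if reply.startswith("TOOL:"):
        _, tool, arg = reply.split(":", 2)
        return tool, arg
    return None


def resolve_symbol(arg: str, index: dict[str, str]) -> str:
    """Look up a class interface / method source in an indexed codebase."""
    return index.get(arg, f"// unknown symbol: {arg}")


def agent_loop(prompt: str, index: dict[str, str], max_steps: int = 5) -> str:
    """Let the model request context instead of guessing member names."""
    conversation = prompt
    reply = ""
    for _ in range(max_steps):
        reply = fake_model(conversation)
        request = parse_tool_request(reply)
        if request is None:
            break  # model produced its final answer
        _tool, arg = request
        conversation += "\n" + resolve_symbol(arg, index)
    return reply


if __name__ == "__main__":
    symbol_index = {
        "IOrderRepository": "interface IOrderRepository { Order GetById(int id); }"
    }
    print(agent_loop("Write a method that loads an order.", symbol_index))
```

The indexing/resolution part is exactly where the deeper, language-specific IDE integration would come in.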
I’m going to call BS on that unless they are hiding some new models with huge context windows…
For anything that’s not boilerplate, you have to type more as a prompt to the AI than just writing it yourself.
Also, if you have a behaviour/variable that is similar to something common, it will stubbornly refuse to do what you want.
Is it censorship, or stopping (actual) fake news and lies because Musk fired Twitter’s moderation team?
(I don’t actually know)
Still not showing anything from Lemmy, right? I guess upvotes don’t convert to boosts? I tried favouriting the post; does that make it show?
It may be that other companies can compete using ARM/RISC architectures. The only reason the current duopoly exists is the x86/x64 cross licensing between Intel and AMD; now that Apple has proved ARM can be competitive, we will see what happens there!
If they were removing sites people would bash them too, there is no way they can win.
Depends if you trust it to actually work.
Could you not have just bought a lower power chip then?
Or does that lose you cores?
As long as the apps all work. So much stuff is browser based now, but something will always turn up that doesn’t work. Something like mandatory timesheet software, a bespoke tool, etc.
Both Intel and AMD are running the same instruction set though, are they not? (Cross licensing x86/x64)
Isn’t this functionality already built into the default web UI?
What is their monetisation plan? Currently they don’t seem to have anything other than donations?
If I don’t give it any permissions, does it actually do anything though? They only run when the app is open I assume?