Has anyone used AI to aid in repair?

the frustrations with AI are not really related to the model but to the dataset. without reliable data, no technological medium is going to be a worthwhile resource

when someone's able to train an agent with accurate design and repair data, such a resource could be super helpful


The best data set in the world is right here.

All people have to do is search and read a couple of pages. It's really not hard.

Are we really so lazy that we need everything spoon-fed to us?
 
A friend at work was using AI to get a quick overview of an Ethernet protocol we are starting to work with. Seemed like it could almost be trusted right up until it described data in the header encoded in a 17-bit byte....
 
not disagreeing

The best data set in the world is right here.

generally agreed

All people have to do is search and read a couple of pages. It's really not hard.

define "spoon fed". 30 years ago we'd have geezers wagging their fingers at us for not scouring physical manuals and instead using search engines and .pdfs. seems to be arguably subjective. i'm not taking a side one way or the other because it's fruitless

Are we really so lazy that we need everything spoon-fed to us?

achieving maximum efficiency through the proliferation of technological advancement is what dictates personal and group behaviors in this world. water is wet, the sky is blue, and it is what it is unless we go full Kaczynski. history tells us there's no half measures here, just doesn't work that way in the industrial (and post-) era

everyone better cozy up to AI, it's already everywhere
 
the frustrations with AI are not really related to the model but to the dataset. without reliable data, no technological medium is going to be a worthwhile resource

when someone's able to train an agent with accurate design and repair data, such a resource could be super helpful

The FPGA gaming revolution will do this. New KiCad schematics and VHDL/Verilog sources, plus MAME drivers, should all be easily digestible by an LLM.
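To make that concrete, here's a rough Python sketch of flattening a KiCad netlist (.net S-expression export) into plain-text connectivity statements an LLM could ingest. The file layout is from memory of KiCad's export format and the field names are my assumptions, so treat it as illustrative, not production code:

```python
# Rough sketch, assuming KiCad's .net S-expression netlist export.
# Turns components + nets into plain-text statements for an LLM corpus.
import re

def parse_sexp(text):
    """Tiny S-expression reader: returns nested lists of strings."""
    tokens = re.findall(r'"[^"]*"|[()]|[^\s()"]+', text)
    def read(i):
        out = []
        while i < len(tokens):
            t = tokens[i]
            if t == '(':
                sub, i = read(i + 1)
                out.append(sub)
            elif t == ')':
                return out, i + 1
            else:
                out.append(t.strip('"'))
                i += 1
        return out, i
    return read(0)[0]

def field(node, key):
    """First child like (ref U12) -> 'U12'."""
    for item in node:
        if isinstance(item, list) and item and item[0] == key:
            return item[1] if len(item) > 1 else None
    return None

def describe_netlist(path):
    tree = parse_sexp(open(path, encoding="utf-8").read())
    comps, nets = {}, []
    def walk(node):
        for item in node:
            if isinstance(item, list) and item:
                if item[0] == 'comp':
                    comps[field(item, 'ref')] = field(item, 'value')
                elif item[0] == 'net':
                    nodes = [(field(n, 'ref'), field(n, 'pin'))
                             for n in item
                             if isinstance(n, list) and n and n[0] == 'node']
                    nets.append((field(item, 'name'), nodes))
                walk(item)
    walk(tree)
    # One sentence per net -- the kind of text you could fold into a
    # fine-tuning or retrieval corpus alongside the schematic PDFs.
    for name, nodes in nets:
        pins = ", ".join(f"{r} ({comps.get(r, '?')}) pin {p}" for r, p in nodes)
        print(f"Net {name} connects {pins}.")
```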
 
people will have major atrophy in a basic ability to think in the coming years... just imagine an entire generation of people who never really learned anything and just looked it up on some AI.

this is much worse than the fact that we don't remember phone numbers anymore or can't drive to a friggin grocery store 7 minutes away without navigation on.
 
absolutely. when contrasted with the tradeoffs, there's little to nothing about efficiency (as a value) that improves the human experience in meaningful ways.
the best thing to happen to the human race would be a massive global EMP but i'm not counting on that happening

so in the meantime, get used to corporations, governments, and institutions forcing AI down your throat and up your ass 😓

however in the context of OP's situation, it's not all negatives. there's your silver lining for the day

That 'maximum efficiency' comes with tradeoffs.
 
i'm going to have to feed that into GPT to figure out what that means and/or if this was sarcasm that i totally missed :confused2: i'm gonna need AI to tell me how to feel here

The FPGA gaming revolution will do this. New KiCad schematics and VHDL/Verilog sources, plus MAME drivers, should all be easily digestible by an LLM.
 
already happening, it's been in motion for the last several decades. literacy rates, critical and abstract thinking, et al have been in decline throughout most of our lifetimes. AI just pushes everyone further along the same trajectory


people will have major atrophy in a basic ability to think in the coming years... just imagine an entire generation of people who never really learned anything and just looked it up on some AI.

this is much worse than the fact that we don't remember phone numbers anymore or can't drive to a friggin grocery store 7 minutes away without navigation on.
 
i'm going to have to feed that into GPT to figure out what that means and/or if this was sarcasm that i totally missed :confused2: i'm gonna need AI to tell me how to feel here
Lol. Well, in order to make new cores for FPGA systems like MiSTer, some people are taking the route of redrawing the schematics, then translating that into a hardware description language. If an LLM gets trained on that, you could reasonably ask it which chips to check for different symptoms.
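As a toy illustration of why that's plausible: even before any training, plain keyword retrieval over netlist-derived text can already answer "which chips should I check?" Every net and chip name below is invented for the example:

```python
# Hypothetical sketch: symptom -> candidate chips via keyword retrieval
# over netlist-derived connectivity text. All names here are made up.
import re

statements = [
    "Net /VSYNC connects U7 (74LS74) pin 5, U11 (custom-PPU) pin 21.",
    "Net /HSYNC connects U7 (74LS74) pin 9, U11 (custom-PPU) pin 22.",
    "Net /SND_EN connects U3 (Z80) pin 14, U9 (YM2151) pin 2.",
]

def chips_for_symptom(statements, keywords):
    """Components on any net whose name matches a symptom keyword."""
    hits = set()
    for line in statements:
        net = line.split()[1]  # e.g. "/VSYNC"
        if any(k.lower() in net.lower() for k in keywords):
            hits.update(re.findall(r'\bU\d+\b', line))
    return sorted(hits)

# "picture rolls vertically" -> start at the sync chain
print(chips_for_symptom(statements, ["vsync"]))  # -> ['U11', 'U7']
```

An LLM with the schematic-to-HDL mapping in its context would do the same thing, just with fuzzier matching between symptom descriptions and signal names.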
 
people will have major atrophy in a basic ability to think in the coming years... just imagine an entire generation of people who never really learned anything and just looked it up on some AI.

this is much worse than the fact that we don't remember phone numbers anymore or can't drive to a friggin grocery store 7 minutes away without navigation on.
People had the same concerns with the printing press, Spinning Jenny, railroads, electricity, telephone, automobile, computer, internet, robots, etc…
 
this is true but the progressive advent and proliferation of these technologies did have a pervasive effect on the human experience. to the point raised by @andrewb regarding AI, none of the institutionalizations of these were without tradeoffs

[image: Jacques Ellul quote on technological civilization as "the defining force"]

People had the same concerns with the printing press, Spinning Jenny, railroads, electricity, telephone, automobile, computer, internet, robots, etc…
 
You're assuming the HDL models and MAME drivers accurately describe the hardware -- they don't.
Doesn't matter, LLMs will be able to understand the connection between the (correct) schematics and (not so correct) emulations in HDL and MAME.
 