Expert systems are nothing new. ChatGPT is making that reality possible. If in the process it becomes better than any human and provides cheaper labor and a better worker, then we simply go back to the drawing board and figure out what to do with our time.
Ridiculous, this is not an expert system. It pulls crap from anywhere, and is totally unreliable.
Do you actually think that doctors could use an "expert system" of this caliber?
Or that anyone would want them to do so? How many billions of dollars in lawsuits do you think
will be filed when such an "expert system" starts killing people with bad information?
Or do you think Tesla has the right to sell people a $15,000 "full self driving mode" that kills them?
Software people seem to have no sense of responsibility, but that will change when they go to prison.
It's rather early in the day to condemn the tech. We are more worried about:
1. Whether or not the tech will see our 'value' when it has the upper hand. Are we able to communicate our pain and joy? Do we have a meeting ground?
2. Its resilience to malware
3. Its resilience to bad updates
4. Its evolutionary path
etc.
I have been making software for the past 30 years now. It’s a nasty business for the most part and I can’t wait for it to be fully automated, even if it means it’ll stop paying my bills.
All of my best software has been motivated by a desire to discover a better, more efficient way to do something that should not burden a human soul.
But the business of software is very different. It’s more about selling new software year in and year out.
As for ChatGPT, I am honestly amazed by it. I wonder why Google doesn’t work like that.
I just wish it would provide a list of references with every answer.
Finally, like it or not, it’s happening. The proverbial genie is out of the lamp and there is no stopping it.
Let’s hope that kind of tech frees us from the most soul-killing tasks, and allows us to concentrate on what machines cannot do.
Until the singularity happens.
But not if it kills people. Right.
Before 1700, nobody died in a road accident. Nowadays, a million people die on the roads every year. Every tech has its drawbacks. And people will always find a way to not care much.
So if your laptop or EV burst into flames and burned down your house, you would not sue the manufacturer for compensation? It's just a drawback, and you wouldn't care much. Right. As long as it's someone else.
People who do just don't get it. They probably never will, despite their environment being increasingly populated by such engines or the products of such engines.
There's a high chance something more advanced exists that isn't even shared with the public.
Bummer that of all companies that could get their hands on it, MS was the one.
q: turing test
The Turing test is a measure of a machine's ability to exhibit intelligent behavior equivalent to, or indistinguishable from, that of a human. It was proposed by British mathematician and computer science pioneer Alan Turing in 1950.
The basic idea of the test is that a human evaluator engages in a natural language conversation with a machine and another human, without knowing which is which. If the evaluator is unable to reliably distinguish the machine's responses from those of the human, the machine is said to have passed the Turing test and demonstrated human-level intelligence.
While the Turing test is widely regarded as an important concept in the field of artificial intelligence, it has also been criticized as a limited and perhaps outdated measure of machine intelligence. In particular, some experts argue that a machine's ability to perform specific tasks or exhibit human-like behaviors in specific domains may be a more relevant and practical measure of its intelligence.
q:write c++ to convert seconds to hour,minute,seconds
ChatGPT
Here's a C++ program that takes an integer input representing the number of seconds, and converts it into hours, minutes, and seconds:
#include <iostream>

int main() {
    int total_seconds, hours, minutes, seconds;

    std::cout << "Enter the total number of seconds: ";
    std::cin >> total_seconds;

    // 3600 seconds per hour, 60 seconds per minute
    hours = total_seconds / 3600;
    total_seconds %= 3600;
    minutes = total_seconds / 60;
    total_seconds %= 60;
    seconds = total_seconds;

    std::cout << "The time is " << hours << " hours, " << minutes
              << " minutes, and " << seconds << " seconds." << std::endl;

    return 0;
}
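For example, an input of 3725 seconds gives 3725 / 3600 = 1 hour, leaves 125 seconds, and 125 / 60 = 2 minutes with 5 seconds remaining, so the program prints "The time is 1 hours, 2 minutes, and 5 seconds."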
Big deal, people have written about this for many years. There's plenty to copy.
It appears those claiming that copying preexisting material is an incredible intellectual achievement are the same people who copied other students' work back in school, and thought well of themselves for knowing how to cheat. There really is no other explanation.
And we won't even mention all the pure BS it is designed to generate; we all know who those people are too.
Only a BS artist would actually build that into software. How'd the 737 Max work out?
rayma said: "Big deal, people have written about this for many years. There's plenty to copy."
Fundamental principles tend to get forgotten in the daily grind. I am interested in the much-hyped use of AI because it is not possible, in an automated way, to prove whether an arbitrary program will run forever (essentially the halting problem; a toy sketch below illustrates why). An academic could design a proof for a specific version of a program, but this is probably more difficult than designing the actual software. And the sad academic would have to start from scratch for every program of interest.
Now we have software that is many times more abstract and complicated than in days of yore. Reliability and correctness are just a dream.
Another concern is the computing resources consumed by all these AI activities. It might become a crypto-like energy hog.
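For the curious, the wall here is the classic diagonal argument behind the halting problem. The toy sketch below only illustrates its shape; the halts() function is a fake stand-in, because the whole point of the argument is that no correct version of it can exist.

// Toy illustration of the diagonal argument behind the halting problem.
// halts() is a FAKE stand-in: the argument shows no correct version can exist.
#include <iostream>

struct Program;                                   // a program that can be run on a program
using Behavior = void (*)(const Program&);
struct Program { Behavior run; };

// Fake decider: pretend it answers "does p, run on input q, eventually halt?"
bool halts(const Program& p, const Program& q) {
    (void)p; (void)q;
    return true;                                  // any fixed answer works for the sketch
}

// The "contrarian": do the opposite of whatever the decider predicts about
// running this very program on itself.
void contrarian_behavior(const Program& self) {
    if (halts(self, self)) {
        std::cout << "Decider says: halts. Contrarian would then loop forever.\n";
    } else {
        std::cout << "Decider says: loops forever. Contrarian then halts at once.\n";
    }
}

int main() {
    Program contrarian{contrarian_behavior};
    contrarian.run(contrarian);                   // either answer is contradicted
    return 0;
}

Whichever answer halts() gives about the contrarian program run on itself, the contrarian does the opposite, so no general decider can be correct; that is why fully automated proofs about arbitrary programs are out of reach.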
Big deal, people in the field which you despise but depend on to post here do that all the time too.
Big deal (and a very mundane task), it just copied someone's already written program.
Humans must credit the prior work of others, or they will lose all credibility in their field.
Or did you not know that?
Far as I'm concerned, anyone making excuses for this total BS is "driving the get-away car"
and guilty of not telling people what they need to know.
You should see it as a good index at the end of a book. It’s what it is. A little more than that, but close.
The index will get you to the page where the information you want is located, but assimilating the knowledge is still the reader’s job and nothing can do that for the reader.
As an exercise, I used ChatGPT extensively to generate a number of 'mundane' C# routines that can be tested individually.
FTP download with retry, file I/O, XML parsing and storage, binary (de)serialization etc...
Once assembled they generated fully functional, royalty free software.
A valid alternative to Stackoverflow.
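To give a flavour of the "with retry" part, here is a minimal, purely illustrative sketch of a generic retry-with-backoff wrapper, written in C++ rather than the C# routines described above; fetch_once is just a stand-in for whatever FTP or HTTP call a real routine would make.

#include <chrono>
#include <functional>
#include <iostream>
#include <thread>

// Retry an operation up to max_attempts times, doubling the delay after each failure.
bool retry_with_backoff(const std::function<bool()>& op,
                        int max_attempts = 3,
                        std::chrono::milliseconds delay = std::chrono::milliseconds(500)) {
    for (int attempt = 1; attempt <= max_attempts; ++attempt) {
        if (op()) {
            return true;                          // success
        }
        std::cerr << "Attempt " << attempt << " failed\n";
        if (attempt < max_attempts) {
            std::this_thread::sleep_for(delay);   // wait before trying again
            delay *= 2;                           // exponential backoff
        }
    }
    return false;                                 // all attempts exhausted
}

int main() {
    // Stand-in "download" that always fails, just to exercise the retry loop.
    auto fetch_once = [] { return false; };
    bool ok = retry_with_backoff(fetch_once);
    std::cout << (ok ? "Downloaded" : "Gave up after retries") << "\n";
    return 0;
}

The numbers (three attempts, half a second to start) are arbitrary defaults; the point is that the transport-specific call is injected from outside, so the wrapper itself can be tested in isolation.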