Hence neither do the above-mentioned humans... I am well aware of that. However, we should not attribute creativity to the software when it just copies, or to humans either.
Nit-picking about 'generation' and 'creativity' here completely misses what is important about the existence of something like ChatGPT, much like the people who kept saying back in the day that 'AI still cannot beat humans at chess' and drew 'conclusions' from that at the time.
If one deprecates and discourages creativity, where will the new code come from for the GPT to copy?
Or will it become self-aware?
This is a lot like the inability of many software guys to understand hardware, so they think it's just trivial.
Hilarious. And how's that "full self driving mode" coming along, how many were killed today?
ChatGPT is not a lot more than the next step past Alexa, Siri, etc. - this is an advanced internet search with the response in prose. The unvetted responses still need a subject-matter expert, making this a (time-wasting?) toy for most applications. Now perhaps it has a place for college students who are too lazy to write their own essays (this should lead to an arms race with the plagiarism detectors).
A natural-sounding voice would make ChatGPT seem truly magical; the illusion that it is more than a difference engine will lead to more deceit than progress, IMO.
The entire software industry is based on deceit, so not unexpected. And educators already are restructuring
courses so that students cannot use such crutches to cheat themselves out of an actual education.
Just because you can use Google does not mean that you know anything.
We should consider GPT for what it is: just a tool. The fact that it writes English with better syntax than I do (and than many of us) makes it fun to use.
If someone just copies code without understanding it, they are failing to perform due diligence.
And I actually know how to bake bread and grow vegetables, if I choose. Not cattle though, since I am a vegan.
OK, I will use quotation marks for ChatGPT content. ChatGPT is a collection of systems that mimics the early stages of the creation of parts of a brain; once they get it right, the landscape will change. At the beginning it will just be a dictionary queried by natural-language processing, but add a bit more scaffolding and it will be limitless.
That's a highly dubious supposition. Why do software guys think they know how the brain works, when no one else does?
Answer this question for yourself: 'Do we need to know how something works to build something else that works like it?'
Read "On Intelligence" by Jeff Hawkins. He defines intelligence as the ability to make predictions about the future. The AI systems do not seem to be close.
Ed
"Answer this question for yourself: 'Do we need to know how something works to build something else that works like it?'"
Of course we do. Unless you have stolen all the IP and documentation from those who actually do understand.
And even then you still have to understand all of it. Which, frankly, doesn't seem very likely.
"Read 'On Intelligence' by Jeff Hawkins. He defines intelligence as the ability to make predictions about the future. The AI systems do not seem to be close."
That's because no one actually knows the future, and so there is nothing for the AI systems to copy.
Judging from some of the comments here, I think we are all doomed.
"Of course we do."
Wrong. We don't anymore.
"And even then you still have to understand all of it."
You are unprepared. Are you afraid of AI or something?
Even artists - who are the embodiments of 'creativity' - are embracing AI tools to 'create' new art.
"That's because no one actually knows the future, and so there is nothing for the AI systems to copy."
In theory, patterns can be recognized to make predictions about the future with enough accuracy. The brain is believed to work that way. The problem is more one of scale than of approach.
Ed
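For what it's worth, the 'recognize patterns to predict' idea can be sketched at toy scale. The bigram model below is my own minimal illustration (the function names and the little corpus are invented for the example); it is not a claim about how ChatGPT or the brain actually works, only a demonstration that prediction can fall out of counted patterns:

```python
from collections import Counter, defaultdict

def train_bigram_model(words):
    """For each word, count which words were observed to follow it."""
    model = defaultdict(Counter)
    for current, nxt in zip(words, words[1:]):
        model[current][nxt] += 1
    return model

def predict_next(model, word):
    """Predict the most frequent follower of `word`; None if never seen."""
    followers = model.get(word)
    if not followers:
        return None
    return followers.most_common(1)[0][0]

corpus = ("the amplifier drives the speaker and "
          "the amplifier drives the load").split()
model = train_bigram_model(corpus)
print(predict_next(model, "amplifier"))  # prints "drives"
```

Scale the corpus up by a few billion words and the counting up to a neural network, and you get something in the general direction of a language model - which is exactly the "scale, not approach" point.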
Ridiculous. I suggest that you go out and get some real-world experience, then come back in 10 years and say that.
BTW, anything can be art if they call it art. No clue.
If you write software, you know very well that it is composed of a number of atomic routines each doing a very simple thing.
These routines can be combined in a large number of ways to achieve novel tasks never done before by existing software.
These routines are rarely all written by the same person. Skilled programmers tailor original software by integrating old and new routines. Still a human process at this stage.
Personally, I find that ChatGPT is particularly well suited to 'generate' atomic routines quickly. Their originality is not especially relevant when prototyping.
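To make the 'atomic routine' point concrete, here are two such routines of my own devising (toy textbook formulas, not anything ChatGPT actually produced in this thread), followed by the human step: composing them into a small check that an input filter's cutoff sits well below the audio band:

```python
import math

def rc_cutoff_hz(r_ohms: float, c_farads: float) -> float:
    """-3 dB cutoff of a first-order RC filter: f = 1 / (2*pi*R*C)."""
    return 1.0 / (2.0 * math.pi * r_ohms * c_farads)

def gain_db(vout: float, vin: float) -> float:
    """Voltage gain in decibels: 20 * log10(Vout / Vin)."""
    return 20.0 * math.log10(vout / vin)

# The composition is the human part: pick values, combine routines,
# and judge the result against a design goal.
f_c = rc_cutoff_hz(10e3, 2.2e-6)      # 10 kOhm with 2.2 uF
print(round(f_c, 2))                   # prints 7.23 (Hz), well below 20 Hz
print(round(gain_db(2.0, 1.0), 2))     # prints 6.02 (dB) for a 2x stage
```

Each routine is trivial and interchangeable; the judgment that 7 Hz is "well below the audio band" is not - which is exactly the division of labor described above.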
"Ridiculous, I suggest that you go out and get some real world experience, and come back in 10 years and say that. BTW, anything can be art, if they call it art. No clue."
I don't think you understand the current state of AI, or how it works. How about you get real-world experience with that: go learn to code it, then come back and chat. It doesn't take 10 years - at least it didn't take me that long.
Besides, you are stuck on words like 'creativity' and 'generation' - they won't do you much good when you can't distinguish the thing produced by an AI from the thing produced by a human, will they?
- ChatGPT ideas about amplifier design