What a rare and healthy opinion to have about him. Wish I heard that more often.
I used to like him for his ideas too, but then he became a massive Twitter troll and I started to strongly dislike him. That hasn't changed to this day.
And that is the point.
It sounds stupidly simple, but AI itself was originally the idea of doing learning and problem-solving more like a human would: by learning how to solve similar problems and transferring that knowledge to a new problem.
Technically there's an argument that our brain is nothing more than an AI with some special features (chemicals for feelings, reflexes, etc.). But it's good to remind ourselves that we are not inherently special. Although all of us are free to feel special, of course.
Well what an interesting question.
Let's look at the definitions in Wikipedia:
Sentience is the ability to experience feelings and sensations.
Experience refers to conscious events in general [...].
Feelings are subjective self-contained phenomenal experiences.
Alright, let's do a thought experiment under the following assumptions:
- An AI works by being told what information goes in and what should come out; it then infers the same for new patterns of information, adjusting based on "how wrong it was" to approximate the correction.
- Every feeling in our body is either chemical or physical, so for simplicity's sake it can be measured / simulated through data input.
Let's also say that for our experiment, the appropriate output is a description of the feeling.
Now, knowing this, and knowing how well different AIs can already comment on, summarize, or perform any other transformative task on larger texts (tasks that force them to interpret data), I think such an AI should be able to "express" what it feels. We can conclude this because everything needed to simulate a feeling or sensation can be described using different data-point inputs.
This brings me to the logical second conclusion: scientifically speaking, there is nothing about sentience that we couldn't already simulate (in light of our assumptions).
Bonus: my little experiment is only designed to show theoretical possibility, and we'd need some proper statistical calculations to know whether it's already practical in a realistic timeframe and with a limited amount of resources. But nothing says it can't be. I guess we have to wait for someone to try it to be sure.
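To make the first assumption concrete, here's a minimal sketch (in plain Python, no ML libraries, entirely my own toy setup, not from the original comment) of the learning loop described above: show the model inputs and target outputs, measure "how wrong it was", and nudge the parameters to approximate the correction.

```python
def train(samples, lr=0.01, epochs=1000):
    """Fit y = w*x + b by gradient descent on squared error."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, target in samples:
            pred = w * x + b
            error = pred - target   # "how wrong it was"
            w -= lr * error * x     # adjust toward the correction
            b -= lr * error
    return w, b

# Toy "sensation" data: an input signal mapped to a described
# intensity, here following the hidden rule y = 2x + 1.
samples = [(x, 2 * x + 1) for x in range(5)]
w, b = train(samples)

# The model now generalizes to inputs it never saw during training,
# e.g. x = 10, which was not in the samples.
print(w * 10 + b)
```

The point of the sketch is only the shape of the loop: nothing in it "knows" the rule y = 2x + 1; the parameters drift toward it purely by repeated correction of the error, which is the transfer-to-new-patterns idea the thought experiment relies on.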
Wow what a stupid title. It's straight up a lie.
Can someone double check the content of the quote? I'd need documents to believe this because that's a big claim.
Mhm, I have a feeling they've got a point.
Kind of. Iirc it's a very controversial practice, and whenever the police pull it out in a public case, it gets protested again (for good reason). Also, even if the practice is legal right now, there are a lot of limitations on it. Either way, it's obviously pushing the ethical boundaries of police work.
https://www.propublica.org/article/cigna-pxdx-medical-health-insurance-rejection-claims
Considering this article talks about automated denial without a case review by doctors, most definitely.
Customer Service Agent.
Longing. It's fascinating to me how crushing that feeling can be, even though it's not negative or even heavy in itself.
a lot of my coworkers are religious and have a foreign background
I think this is where the bias he wants to remove settles in.
Idk if this is a hot take, but imo the war in Ukraine is pretty clear-cut, while the Palestinian and Israeli conflict has an infinite list of wrinkles and nuances.
It's far less controversial to say the former is Russia's fault than it is to say the latter is either Palestine's or Israel's fault.