Today’s AI technology makes it possible to create ‘digital humans’ that can talk like a person even after their death. These are called ‘deathbots’. A deathbot is a computer program that can sound like that person, look like them, and respond to messages in their manner. It is built from the person’s old photos, videos and voice recordings.
Often family members commission one in memory of someone close to them. But companies have now emerged that let you create your own ‘digital lookalike’ while you are still alive, so that your family can talk to it after you are gone.
How does this work?
Creating an AI ‘digital lookalike’ has become fairly easy. First you create an account with a company offering such a service, where you are asked many questions about your likes, dislikes and views. You then record your memories and stories in your own voice, and upload your photos and videos. From all of this, the AI software builds a digital copy of you. Later, when your family informs the company of your death, they can talk to that digital version of you much like a real person. The point to note is that no one is being ‘brought back’ without consent here: you hand over your data to the company of your own free will so that it can create this digital form of you.
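The workflow above can be sketched in code. This is a minimal, hypothetical illustration, not any real service’s pipeline: the names `PersonaProfile`, `build_persona_prompt` and `recall` are invented, and the keyword-overlap retrieval is a toy stand-in for the language, voice and face models an actual product would use.

```python
from dataclasses import dataclass, field

@dataclass
class PersonaProfile:
    """Hypothetical container for the data such a service collects."""
    name: str
    answers: dict                                  # questionnaire: topic -> the person's view
    memories: list = field(default_factory=list)   # transcribed stories in the person's words

def build_persona_prompt(profile: PersonaProfile) -> str:
    """Assemble the collected data into a persona description that a
    language model could be conditioned on. A real pipeline would also
    feed voice samples and video into cloning/animation models."""
    lines = [f"You are a digital likeness of {profile.name}."]
    for topic, view in profile.answers.items():
        lines.append(f"On {topic}: {view}")
    for memory in profile.memories:
        lines.append(f"Memory: {memory}")
    return "\n".join(lines)

def recall(profile: PersonaProfile, message: str) -> str:
    """Toy retrieval: return the stored memory sharing the most words
    with the incoming message, standing in for a generated reply."""
    if not profile.memories:
        return "I have no memories recorded."
    words = set(message.lower().split())
    return max(profile.memories,
               key=lambda m: len(set(m.lower().split()) & words))

# Example data, invented for illustration
profile = PersonaProfile(
    name="Asha",
    answers={"music": "I loved old film songs."},
    memories=[
        "Our summer trips to the sea were my happiest days.",
        "I learned to cook from my grandmother.",
    ],
)
print(recall(profile, "Tell me about the sea"))
```

The key design point this sketch makes concrete is the one the article stresses: the ‘digital copy’ is only as faithful as the data the person chose to provide while alive.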
Legal implications and rights
As this technology grows, many legal complications are emerging that currently have no clear answers. The biggest is that the law does not treat your voice or your personality as your ‘property’; you do not own your identity the way you own land or a house. You do hold rights over the photos and recordings you make yourself, but it is difficult to claim copyright over what the AI generates on its own, because in the eyes of the law it was created by a machine, not a human being. There is also a real fear: if the company shuts down or the technology changes, what happens to your digital form? Will your ‘digital lookalike’ be lost forever? If that happens, your family may have to face the same grief all over again that they felt when you died.
Ethical risks and challenges
Creating a digital doppelganger carries not only legal but also social and psychological risks. The biggest problem is that the AI runs entirely on computer programs, so over time it may start saying things completely at odds with your real thinking or nature. There is also the fear that, instead of moving on from their memories, family members may become so attached to the AI that they never return to normal life. And if the AI one day says something that hurts the family or sparks a dispute, who is responsible? That too is not yet clear.
