Lecture on 'AI in Journalism' by Dr. Charlie Beckett
powered by JotCast
Sudhanshu Sanwal
5:28
Replying to concerns about algorithmic bias, Charlie explains that the algorithm has to be checked at the initial stage. He also addresses human bias and the harms it causes. He says: "We are gonna need less reporting on robots taking our jobs and more on the good it brings."
5:29
Beckett further adds that blaming social media for Trump winning the 2016 US presidential election is a bias in itself.
5:30
He elaborates on his point: "The machines are very good at knowing what you like. So in that sense, they don't have emotions but they are driving emotions. In good ways and bad ways."
5:31
Is the general audience wary of AI seeping into their lives?

Yes, very (55.6% | 5 votes)

No. They are embracing it (44.4% | 4 votes)

Total Votes: 9
On the role of emotions in journalism, Beckett says: "Not all emotions are bad. Emotions can be good, and it's a vital part of journalism."
He adds: "How do you quantify emotions? It is very difficult. But the algorithm is going to help in understanding how emotions work in the media."
5:36
Beckett believes that AI can help journalists by adding value to their judgement. He asserts: "Here is my thesis. I think filter bubbles are a really good thing. Media doesn't create them. We have always had them. [And] journalism has always been about novelty."
5:38
Talking about personalized newsrooms in detail, he shares his insights: "However, there is a different problem: which is a closed group of people who do not want to refer to anyone else. It's a different kind of problem."
5:40
Further discussing news organisations and bias, Charlie says that accountability always exists in some form; readers' rights and rules are in place.
5:42
Talking about accountability, Beckett says: "Technically, news organisations are supposed to be accountable. There is all kinds of accountability in terms of money. But it's the other way around. They are far more worried about inaccuracies of the robots, rather than worrying about their journalists."
5:43
Turning to the Indian context, Charlie believes that more can be done to address the current situation, because technology has no power on its own. He adds: "You can use AI to build weapons of mass destruction if you want. Technology doesn't have a sense of morality. People decide how to use the technology."
5:47
On that note, the session draws to a close.
Thank you all for tuning in for this lecture. Have a great day ahead.