On Tuesday, lawmakers in the United States questioned Alphabet Inc., Facebook Inc. and Twitter Inc. about how user content and data are shared on their platforms, practices that one senator believes can be easily misused.
Senator Ben Sasse, a member of the Senate Judiciary Committee’s panel on Privacy, Technology and the Law, said these platforms have become nothing short of “poisonous echo chambers”. His comment came as committee members examined the algorithms these platforms use, which play a central role in determining how user-generated information is broadcast and who gets access to it.
According to Sasse, who hails from Nebraska, algorithms, like any other new technology, involve “costs and benefits” and can be misused or abused by those who know how to manipulate them. He believes this could have dangerous effects in the long run.
The hearing came as Congress considers different ways to revamp Section 230, a provision of communications law enacted in 1996. The provision shields internet-based companies from liability arising from the way user content is shared. One House proposal would make social networking platforms assume responsibility for the manner in which content is broadcast and amplified by their algorithms.
Opening the hearing, the subcommittee’s chair, Democrat Chris Coons, remarked that he hoped to use the opportunity to better understand how these companies’ algorithms work and what could be done to curb hazardous algorithmic amplification.
Illinois Senator Dick Durbin urged social media companies to be more vigilant and take effective steps to eliminate harmful or malicious content from their platforms, citing the January 6 attack on the U.S. Capitol to underscore the need. He said domestic extremists had repeatedly shared false information on some of these platforms, which he called dangerous.
Monika Bickert, vice president for content policy at Facebook, testified that the platform’s tools make its algorithm more transparent, helping users understand why certain posts or information appear in their news feed.