tv Inside Story Al Jazeera April 25, 2022 2:30pm-3:01pm AST
2:30 pm
...finished a tour.

Oh, the feeling must be like in the war, when you hear outside the planes, the bombs.

We have had so many decades of peace in Europe. This is a very difficult situation. I hope there will be a peaceful solution to all of this.

Even at the height of the Cold War, Berlin could only protect 27,500 people in its bunkers, out of a population of 2.1 million. So the government's plan to upgrade them is seen as a way to reassure its citizens, rather than aiming to offer protection for everyone. Step Vaessen, Al Jazeera, Berlin.

Now the headlines on Al Jazeera. US Secretary of State Antony Blinken has promised Ukraine's forces continued support from the White House. He met the Ukrainian president in Kyiv on Sunday
2:31 pm
along with the US Defense Secretary, Lloyd Austin.

Russia is failing. Ukraine is succeeding. Russia has sought, as its principal aim, to totally subjugate Ukraine, to take away its sovereignty, to take away its independence. That has failed. It sought to assert the power of its military and its economy. We are, of course, seeing just the opposite: a military that is dramatically underperforming; an economy, as a result of the sanctions, as a result of a mass exodus from Russia, that is in shambles.

Vladimir Putin says the West is trying to cause division in Russia in a bid to limit its territorial gains in Ukraine. He's been speaking in Moscow. The Russian president also said foreign media outlets are provoking Russia's military. Dorsa Jabbari is in Moscow with more.

The gist of the speech was that the country needs to continue to remain united and fight extremism on many fronts, according to the Russian president. He said that the general prosecutor's office must continue to be vigilant when
2:32 pm
it comes to fighting what he said were provocations against the Russian military and its so-called special military operation in Ukraine. He said that the extremism coming from the foreign media must be stopped.

French President Emmanuel Macron has been re-elected. He defeated far-right challenger Marine Le Pen. Macron has acknowledged voters' dissatisfaction with his first term.

Mass testing for COVID-19 has begun in Beijing's largest district, Chaoyang, as China tries to track an outbreak that may have been spreading in the capital for a week.

The first verdict in the corruption cases against Myanmar's deposed civilian leader has been delayed. Aung San Suu Kyi was arrested by the army when it seized power a year ago. If found guilty, she could be jailed for up to 15 years.

Those are the headlines on Al Jazeera. Next, it's Inside Story. Thanks for watching. Bye-bye for now.
2:33 pm
Keep users safe or be fined billions of dollars. The EU sets new rules forcing tech companies to remove illegal content. How will the regulations work, and could they limit free speech on the internet? This is Inside Story.

Hello and welcome to the program. I'm Mohammed Jamjoom. It's being hailed as the start of a new era for online protection. The European Union has approved rules to force big
2:34 pm
technology firms such as Google, Facebook and Twitter to remove illegal content. If they don't, they can be fined billions of dollars. Tech companies lobbied against the Digital Services Act, or DSA, but EU politicians finalized the details on Saturday after 16 hours of negotiations. The rules outline how companies should keep users safe on the internet. Tech groups will also have to disclose how they tackle disinformation and propaganda. That effort gained momentum after Russia's invasion of Ukraine.

We have the political agreement on the Digital Services Act. I have learned so much these two years, and the agreement tonight is better than the proposal that we tabled. And what we have achieved is that it is not a slogan anymore, that what is illegal offline should also be seen and dealt with as illegal online. Now it is a real thing. Democracy is back, helping us to get our rights and to feel safe when
2:35 pm
we are online.

The regulations will come into force in 2024. Governments can ask companies to remove content that promotes terrorism, child sexual abuse and commercial scams. Social media platforms such as Facebook and Twitter will have to make it easier for users to flag harmful content. That will also apply to e-commerce companies like Amazon, for counterfeit or unsafe products. Companies will be banned from using deceptive techniques to get people to sign up for services. Repeated breaches could see them banned from trading in the EU. Google responded by saying it will work with policymakers on the details.

Last month, the EU approved separate legislation to prevent anti-competitive behavior by tech companies. For example, sites like Google will be banned from favoring their own services in search engine results. Messenger apps, including WhatsApp, will have to interoperate with smaller platforms, and it will prevent software being pre-installed on computers and phones. Tech companies say that could stifle innovation.
2:36 pm
All right, let's go ahead and bring in our guests. From Brussels, Johannes Bahrke, coordinating spokesperson for the digital economy, research and innovation at the European Commission. From Maastricht, Catalina Goanta, associate professor in private law and technology at Utrecht University in the Netherlands. And from Paris, Thomas Vinje, partner and co-chair of the global antitrust group at the Clifford Chance law firm. A warm welcome to you all, and thanks so much for joining us today on Inside Story.

Johannes, let me start with you today. Just how will these regulations work? And given that these new laws are to be enforced out of Brussels rather than through regulators in individual countries, is that going to make enforcement easier?

I think, first of all, yes, it will make it vastly easier. It's important to note that this applies only to the very big platforms. So if you reach 10 percent of Europeans, that is 45 million users, then the regulation will be enforced by the European Commission. So
2:37 pm
a smaller platform will not be regulated from Brussels. But it is, of course, easier if there is one set of rules for all platforms in the EU.

Catalina, from your perspective, just how groundbreaking are these new rules and regulations, and how significant is all this?

I think that especially the DSA, if we're looking particularly at the public discourse around illegal content, is definitely groundbreaking, because the DSA is making the invisible visible. We have had a lot of rules in the past that have been applicable together with the e-Commerce Directive, which is now the predecessor of the DSA, the Digital Services Act. And there has been a lot of focus given to, you know, what kind of content should be online, such as the fact that platforms should have a lot of activities that they will be responsible for in terms of taking down, for instance, terrorist
2:38 pm
content or child pornography. So these rules have already been there before, but what the DSA is now bringing to the fore, and therefore making the invisible visible, is that there is a plethora of other rules that will be applicable at the same time as the traditional types of illegal content that we know. And we can see this in the definition of illegal content in the DSA currently, which says that illegal content is basically any kind of content that is in violation of national or European law.

Thomas, so now we have the Digital Services Act that has been approved. Earlier this year, the EU approved the Digital Markets Act, which is meant to tackle the market power of Silicon Valley firms. How far-reaching will these regulatory actions be?

Well, the Digital Markets Act that you just mentioned is rather revolutionary in terms of the obligations that it places upon the largest platforms, which might very well include a few European companies beyond the
2:39 pm
American ones that will undoubtedly be included. So there will be some very far-reaching obligations on those companies to enable interoperability, for example. Facebook and Apple will be required to allow users different forms of in-app payment mechanisms. Google will have to rank its search results fairly, without any preference for its own results. So quite a number of really very serious obligations, in addition, of course, to the brand-new political agreement on the DSA, which has already been mentioned.

Johannes, these regulations, this is all very, very big in scope. How will it be ensured that tech companies are going to police themselves?

Yeah, look, I mean, I would like to add maybe one thing to what has been said, because the DSA is not only looking at illegal content, but also at harmful content. So think about, for example,
2:40 pm
content that promotes, or can increase, things like eating disorders. For example, young people following a feed may be inclined towards disordered eating. That is something that is not illegal, but it is harmful to the young people who follow it. And so the very large platforms will need to do a risk assessment every year, and then the risks that are identified need to be mitigated, and this will then be controlled by the Commission. And of course there will be fines. The fines can go up to six percent of global turnover, and in extreme cases there could even be a ban of a certain platform. So the measure is really to start with the risk assessment, see what risks are identified, how they are mitigated, and whether enough is being done or not.

Johannes, let me just follow up with you regarding something you said. You're talking about
2:41 pm
harmful content. Who gets to decide what is considered harmful content? I mean, I realize that there are a lot of details that still need to be worked out going forward, but who's going to be the arbiter on this?

Look, I mean, one thing is of course illegal content, and that has been easy to identify because it is defined by the law. When it comes to harmful content, it is really for the platforms to look into what their risk profiles are, what is harmful, or what can be harmful, in terms of their content. So there are four categories that need to be assessed beyond illegal content: for example, restrictions to freedoms, or harmful content that could, for example, affect minors, or disinformation. So it really depends. A platform that hosts content has a different risk profile than, for example, an online marketplace, where it is maybe more about counterfeit
2:42 pm
goods that may not be, let's say, dangerous, but are still not compliant. So it really depends on the platforms themselves to see: what is the risk profile that I have as a platform, and how do I need to mitigate this?

Catalina, I saw you reacting to a lot of what Johannes was saying there, so I'm going to let you jump in, but I also want to ask you the same question. I mean, who gets to ultimately decide what is deemed harmful content, and does that bring up freedom of speech concerns as well?

Absolutely. So the answer to your question is actually going to determine the success of the DSA, because I completely agree with Johannes. There are a lot of different areas or different types of content that the DSA covers. So illegal content: you can look at this also from a very technical perspective, and you can see, okay, is there a provision at national or European level that is going to say you can't have advertising towards minors under a specific age? And if that exists, for instance if we look at the Audiovisual Media Services Directive,
2:43 pm
then immediately, if a type of content is going to be against this provision, it can be deemed illegal. Then we also have the systemic risks. So if a platform, through its architecture, through the fact that it amplifies types of content, could have an impact on decision-making, on the well-being of users, or on a lot of other aspects such as democracy, touching on what you were mentioning, freedom of expression and also political advertising, then it can be a matter of really trying to investigate. And this is where the procedural aspects of the DSA come very much in handy: to investigate what exactly the systemic risks are and how we can address them. The main problem here, the key word here, is going to be data access. And I really hope that in the enforcement of the DSA, and also in the implementation, so first of all in the design of the infrastructures that are going to have to be implemented, there will be
2:44 pm
sufficient attention paid to the fact that data shared by platforms is not necessarily reliable. So we have a little bit of a catch-22 situation. We have seen this before, when Facebook tried to share research data, and we've seen that there were mistakes made, and a lot of research was completely inaccurate because the Facebook data simply was not good.

Catalina, it sounds like what you're describing is rather complicated as far as setting up procedures in order to be able to implement this. I mean, do you foresee this going smoothly, or do you think this is going to get rather messy?

It very much depends also on the coordination of all of the stakeholders that already have powers of investigation. For instance, look at the national consumer protection authorities, because consumer protection has been a very big concern for the DSA as well. Johannes was already mentioning the fact that you can have various types of content, for instance counterfeit goods, which are sometimes also considered to be products that platforms need to look at and
2:45 pm
then remove, also from a consumer protection perspective, not just the intellectual property perspective. But the problem here is that it very much depends on how all of these organizations that have a stake in the implementation of the DSA, and in the enforcement of prior regulation, will come together. There has been a lot written about the political compromise around the DSA, for instance on targeted advertising that is actually meant to target minors. So who is the minor? In which country? Which standard do we have? Who is responsible: is it advertising, is it data protection? A lot of authorities nationally need to come into this web of not only substantive rules but also enforcement, to figure out how exactly to divide, or to coordinate, the space of the digital market, because otherwise there will be a cannibalization of enforcement.

Thomas, from your vantage point,
2:46 pm
how do you think tech companies are going to deal with these new regulations? Do you expect that these new EU regulations are going to face a lot of legal challenges by Big Tech?

Well, one could suggest that perhaps the declarations of victory that have surrounded this political agreement might be a bit premature, in the sense that it seems to me, at least, that there is a great deal that remains to be seen about how the DSA is enforced, how harmful content, as you suggest, is defined, and how these new rules will be implemented by the tech companies. I do find that there is quite a lot of vagueness in the new proposed law, as it is not yet fully adopted. And it's going to be difficult, I believe, for tech companies to know exactly what they have to do and what they may not
2:47 pm
do. So I think there's a great deal of uncertainty about how this actually pans out. It will require a lot of good faith on the part of everybody involved, and we'll see whether that good faith materializes.

And Thomas, just to follow up, when you're talking about this great deal of uncertainty, are you talking specifically about the Digital Markets Act, the Digital Services Act, or both?

I'm talking about both, but I must say that my area is antitrust, so the Digital Markets Act is the one that I am much more deeply involved in. But it seems to me that the Digital Services Act, almost by design it seems, incorporates a great deal more uncertainty. The Digital Markets Act will also have a great deal of questions surrounding it, about what it means and what the companies who are gatekeepers have to do and what they may not do.

Johannes, how much momentum did the effort to pass this legislation gain from the fact that there is
2:48 pm
so much disinformation out there with regard to the war in Ukraine, and also so much disinformation out there when it comes to COVID-19?

Yeah, I think these developments highlight the need to do something, and they have influenced the negotiations a little bit, in a way that incorporated the crisis mechanism. Because one thing is that you have regular exchanges with the platforms, but another thing is that you have a situation that you cannot preempt, like now the war in Ukraine, to which you need to react, and to ask the platforms, as a regulator, what they are seeing and what they are doing to mitigate the risks. And this is why the crisis mechanism is incorporated now, to allow these kinds of conversations and cooperation on these issues. But let me make another point, because you mentioned freedom of expression: the DSA is precisely there to ensure freedom of expression, because one thing is
2:49 pm
to remove, another thing is to over-remove. So this is why, for example, the terms of use need to be very clearly written, so that you can understand why content was removed. And users also need to have a means of redress, to very easily say: I think this content should be reinstated. And I think what you want to do here is to ensure that democracies and legislators do not leave these questions only to the platforms, but provide a governance framework which governs the way this works.

And if I can make a last point on what Thomas said about vagueness: maybe you find a lot of this rather vague now, but that is because it needs to be future-proof. There are a certain number of harms nowadays that did not exist, say, ten years ago or five years ago, and you do not know what the next ones, five or ten years from now,
2:50 pm
are. So you need to have laws that are future-proof. The e-Commerce Directive is 20 years old; that is a long time. So it needs to be future-proof and cover also the emerging and future harms.

Thomas, I saw you nodding along to some of what Johannes was saying there. It looked like you wanted to react. Please go ahead.

Yeah, I agree completely that precisely defining, for example, what constitutes harmful content is really just not possible, because new kinds of harmful content will arise, and it needs to be future-proof. So I don't disagree at all with that. But the fact that the contours of harmful content are so uncertain is going to require a great deal of good faith on everyone's side in order to make this work, it seems to me.

Catalina, the reporting
2:51 pm
about what's in the DSA, the Digital Services Act, says that it's going to allow people to choose how content is presented to them. I want to ask you: is it actually going to allow people to stop algorithmic profiling that is set up to get more engagement? Are they actually going to be able to opt out of that?

So I think there has been a lot of public discourse around, for instance, targeted advertising, for instance with an MEP who really has been speaking about this at length. What we can already see is that a lot of platforms are already creating parallel recommender systems. For instance, Instagram and Twitter, even TikTok, have that right now. There are definitely different types of, for instance, 'For You' pages, or different types of feeds. And some of them are based more, perhaps, on time-related or data-related factors, and some of them are based on,
2:52 pm
for instance, the people who you follow on a platform. So from that perspective, you could argue that some changes are already becoming very visible. Now, the thing is that one of the major problems, which I think the DSA perhaps can start to tackle, but which is so complex that we still need a lot of other solutions, is the fact that the data broker market is very opaque. So do we even have a map of really understanding which companies are using the type of data that is being collected, also by social media platforms? Which types of platforms or marketplaces are not using that? What is the percentage of platforms that use that? All of these questions remain still to be answered, because the pace at which the online and digital economy is developing is incredibly fast. And this is why I completely agree with what Johannes was mentioning,
2:53 pm
that we need future-proof regulation. The question is, how do we draw the line between having too little content and context and having too much, as Thomas was mentioning? Because otherwise, sometimes we deal with legal uncertainty even when future-proof regulation is there. And the example that comes to mind is the Unfair Commercial Practices Directive, which has, and still can, very nicely tackle dark patterns, which are another type of user manipulation that we see very much quoted in the public discourse around the DSA. But if you look at the definitions in Article 5, for instance, of the UCPD, you see tests that are very, very vague, like an unfairness test. You still need to rely on a judge to interpret that. And if you don't have clear interpretations for a very specific situation, then the market doesn't know what to do with that. And we see that very clearly with dark patterns.

Catalina, I also want to ask you for a moment about another law that the EU enacted, the data privacy law called the
2:54 pm
General Data Protection Regulation. You know, there's been a lot of criticism that it has not been enforced vigorously enough. Are you concerned that the same thing might happen with the Digital Services Act?

I definitely see that we're moving into an era of digital enforcement, digital market monitoring and also market surveillance, done not only by the data protection authorities, whether at European or national level, that are designated and have the mandate to implement and enforce the GDPR, but also by a lot of other agencies and a lot of other authorities. So I think we're going to see more and more enforcement in the future. The question that I would like to ask, also perhaps for this discussion, but also for other debates, is how do we ensure that we have coherent enforcement across all of these agencies that are stakeholders in the digital market? And that is the million-dollar question.

Johannes, there are tech companies who say that
2:55 pm
these regulations could stifle innovation. What's your reaction to that?

So if I could just quickly come back to the enforcement, because it is very different. Yes, with the GDPR you have data protection enforced not only at national level but also regionally; for example, in Germany the regions have their own data protection offices, so you have several enforcers. And that comes back to your first question: the DSA will be enforced by the Commission for the platforms with over 45 million users, and the same is true for the DMA. So that is a completely different situation.

And I would also like to add one point on the targeting. What the DSA foresees is that you have an option to indeed opt out and say: no, I do not want to be targeted, and I want a recommender system that is not based on targeting. Now you can of course say it is not very useful, but we leave that choice with the consumer itself, and we have a ban on targeting for minors and for sensitive data. Now, on your question about innovation: the point of the DSA is
2:56 pm
to introduce obligations that grow with size. So size matters. If you're a small company, you will actually find it easier, because you don't have to comply with the same obligations as the big platforms. There is also one aspect, and it is very important exactly that you don't stifle innovation, and this is exactly what the DSA does: it imposes heavier obligations on the big players than on the small ones. And the DMA, for example, gives rights to smaller competitors, for example those that use marketplaces to sell their products, to have access to data which nowadays only the big players have. So it will actually protect the small companies.

Thomas, we have a little less than a minute left here. Let me just ask you very quickly, from your vantage point: does the EU have enough manpower to actually be able to monitor and enforce the Digital Services Act and the Digital Markets Act?

That was exactly the point I wanted to raise. Both of these pieces of legislation together mark
2:57 pm
a huge increase in the amount of regulation in the tech sphere, and I tend to doubt that there will be sufficient resources deployed by the European Commission to be able to effectively enforce them.

All right, well, we have run out of time, so we're going to have to leave the conversation there. Thanks so much to all of our guests: Johannes Bahrke, Catalina Goanta and Thomas Vinje. And thank you for watching. You can see the program again any time by visiting our website, aljazeera.com. And for further discussion, go to our Facebook page, that's facebook.com/AJInsideStory. You can also join the conversation on Twitter; our handle is @AJInsideStory. For me, Mohammed Jamjoom, and the whole team here, bye for now.
2:58 pm
The climate has changed every year for millions of years. Decades of talk but little action. It's all about distraction: create confusion, create smoke and mirrors. The shocking truth about how the climate debate has been systematically subverted. The oil industry was the main bankroller of opposition to climate action. Do you think that's a bad thing? That was a good thing? Absolutely. On Al Jazeera.
2:59 pm
The story goes that the statue of an ancient Greek god lay beneath the waves for millennia, until a Palestinian fisherman unearthed the priceless relic. The story continues that, as the world's attention was drawn to Gaza, mysteriously, the deity disappeared once again. The Apollo of Gaza, on Al Jazeera.

Talk to Al Jazeera. We ask: What is the timetable in your mind? When do you think you can be off Russian gas? We listen: I have seen and played with these refugees. I look at them and they're happy, they're smiling. We meet with global newsmakers and talk about the stories that matter, on Al Jazeera. We understand the differences
3:00 pm
and similarities of cultures across the world. No matter where you call home, Al Jazeera will bring you the news and current affairs that matter to you.

Hello, I'm Folly Bah Thibault in Doha with a look at the main stories on Al Jazeera. US Secretary of State Antony Blinken has promised Ukraine's forces continued support from the White House. He met the Ukrainian president in Kyiv on Sunday, along with US Defense Secretary Lloyd Austin. Both men praised Ukraine's efforts to defend itself and said Russia's efforts were failing.

Russia is failing. Ukraine is succeeding. Russia has sought as its principal aim to totally subjugate Ukraine, to take...