tv Inside Story Al Jazeera April 25, 2022 3:30am-4:00am AST
3:30 am
We are used to seeing this kind of flooding happening all across the African continent. It's estimated it will cost billions of dollars to fix the damage caused by the latest flooding in South Africa, and environmental groups say severe weather has killed hundreds of people in southern Africa in the last six months alone. Fahmida Miller, Al Jazeera, Johannesburg, South Africa.

You're watching Al Jazeera with me, Sohail Rahman. A reminder of our top stories: French President Emmanuel Macron has been re-elected, comfortably beating far-right challenger Marine Le Pen. Macron acknowledged dissatisfaction with his first term: "I know that for many of our compatriots who chose the extreme right, the anger and disagreements which led them to vote for that project must find a response. That will be my responsibility and the responsibility of those who
3:31 am
surround me. Because the vote of this election means we have to consider all the difficulties people have lived through, and to respond to their anger and disappointment. My dear friends, today you made a choice for humanism, for ambitious independence for our country, and for all of Europe."

In her concession speech, Le Pen said the result puts her party in a strong position for the upcoming legislative elections: "We have been declared dead a thousand times, and a thousand times history has proven wrong those who predicted or wished for our demise. In this defeat, I can't help feeling hopeful. This result is proof of a great defiance on the part of the French people, one that our leaders in France and Europe cannot ignore. It's proof of a widely shared wish for great change. The French people have expressed tonight a wish for a strong opposition against Emmanuel Macron, one that will continue to defend and protect them."

There have been isolated protests by far-left groups against Macron's victory. Riot
3:32 am
police moved in and used tear gas to disperse demonstrators in Paris. There were similar scenes in the northwestern city of Rennes. French police also shot at two people in central Paris while they were driving a car against the flow of traffic, when the vehicle sped towards officers.

In other news, elections in Slovenia have delivered a shock and a new prime minister: the environmentalist party of political newcomer Robert Golob won the most votes, and he's expected to form a coalition government. Ukrainian officials say the US secretaries of state and defense are in the capital, Kyiv, meeting with President Volodymyr Zelenskyy. The White House hasn't confirmed the trip, which would be the first high-level visit by US officials since Russia began its invasion of Ukraine.

Of course, you can follow all of those stories on our website by logging on at aljazeera.com; it's updated throughout the day. I'll be back with more news in half an hour. Next on Al Jazeera, it's Inside Story. Do stay with us.
3:33 am
Keep users safe or face fines of billions of dollars: the EU sets new rules forcing tech companies to remove illegal content. How will the regulations work? And could they limit free speech on the internet? This is Inside Story.

Hello and welcome to the program. I'm Mohammed Jamjoom. It's being hailed as the start of a new era for online protection. The European Union has approved rules to force big technology firms such as Google, Facebook,
3:34 am
and Twitter to remove illegal content. If they don't, they can be fined billions of dollars. Tech companies lobbied against the Digital Services Act, or DSA, but EU politicians finalized the details on Saturday after 16 hours of negotiations. The rules outline how companies should keep users safe on the internet. Tech groups will also have to disclose how they tackle disinformation and propaganda; that effort gained momentum after Russia's invasion of Ukraine. "We have the political agreement on the Digital Services Act. I have learned so much these two years, and the agreement tonight is better than the proposal that we tabled. What we have achieved is that it is not a slogan anymore that what is illegal offline should also be seen as illegal online. Now it is a real thing. Democracy is back, helping us to get our rights and to feel safe when
3:35 am
we are online." The regulations will come into force in 2024. Governments can ask companies to remove content that promotes terrorism, child sexual abuse, and commercial scams. Social media platforms such as Facebook and Twitter will have to make it easier for users to flag harmful content. That will also apply to e-commerce companies like Amazon, for counterfeit or unsafe products. Companies will be banned from using deceptive techniques to get people to sign up for services. Repeated breaches could see them banned from trading in the EU. Google responded by saying it will work with policymakers on the details. Last month, the EU approved separate legislation to prevent anti-competitive behavior by tech companies. For example, sites like Google will be banned from favoring their own services in search engine results. Messenger apps, including WhatsApp, will have to interoperate with smaller platforms, and it will prevent software being pre-installed on computers and phones. Tech companies say that could stifle innovation.
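The headline numbers behind the rules outlined above, and returned to later in the programme (a 45-million-user threshold for direct Commission supervision, and fines of up to 6 percent of global turnover), can be sketched as a small calculation. This is a hypothetical illustration added by the editor, not part of the broadcast; the function names and the example figures are assumptions.

```python
# Illustrative sketch of the DSA thresholds described in the programme.
# The 45-million-user cut-off and the 6%-of-turnover fine cap come from
# the discussion; everything else here is an assumed example.

VLOP_USER_THRESHOLD = 45_000_000  # roughly 10% of the EU population
MAX_FINE_RATE = 0.06              # fines of up to 6% of global turnover


def is_very_large_platform(monthly_active_eu_users: int) -> bool:
    """Platforms at or above the threshold are supervised by the Commission."""
    return monthly_active_eu_users >= VLOP_USER_THRESHOLD


def max_fine(global_annual_turnover_eur: float) -> float:
    """Upper bound of a DSA fine for a given global annual turnover."""
    return global_annual_turnover_eur * MAX_FINE_RATE


# Illustrative figures only:
print(is_very_large_platform(50_000_000))   # True: Commission supervises
print(max_fine(100_000_000_000))            # cap of 6 billion euros
```

Smaller platforms fall below the threshold and are not supervised from Brussels, which is the distinction the Commission spokesperson draws later in the discussion.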
3:36 am
All right, let's go ahead and bring in our guests. From Brussels, Johannes Bahrke, coordinating spokesperson for the digital economy, research and innovation at the European Commission. From Maastricht, Catalina Goanta, associate professor in private law and technology at Utrecht University in the Netherlands. And from Paris, Thomas Vinje, partner and co-chair of the global antitrust group at the law firm Clifford Chance. A warm welcome to you all, and thanks so much for joining us today on Inside Story.

Johannes, let me start with you today. Just how will these regulations work? And the fact that these new laws are to be enforced out of Brussels rather than through regulators in individual countries: is that going to make enforcement easier?

I think, first of all, yes, it will make enforcement easier. It's important to note that only the very big platforms, those reaching 10 percent of the European population, that is 45 million users, will be supervised directly by the Commission. Smaller
3:37 am
platforms will not be supervised from Brussels. But it is of course easier to have this one set of rules for all platforms in the EU, including the smaller ones.

Catalina, from your perspective, just how groundbreaking are these new rules and regulations, and how significant is all this?

I think that, especially with the DSA, if we're looking particularly at the public discourse around illegal content, this is definitely groundbreaking, because the DSA is making the invisible visible. We have had a lot of rules in the past that have been applicable, together with the e-Commerce Directive, which is now the predecessor of the DSA, the Digital Services Act. And a lot of focus has been given to what kind of content should be online, such as the fact that platforms should have a lot of activities that they will be responsible for in terms of taking down, for instance,
3:38 am
terrorist content or child pornography. So these rules have already been there before, but what the DSA is now bringing to the fore, and thereby making the invisible visible, is that there is a plethora of other rules that will be applicable at the same time as the traditional types of illegal content that we know. We can see this in the definition of illegal content in the DSA, which currently says that illegal content is basically any kind of content that is in violation of national or European law.

Thomas, so now we have the Digital Services Act, which was approved earlier this year, and the EU approved the Digital Markets Act, which is meant to tackle the market power of Silicon Valley firms. How far-reaching will these regulatory actions be?

Well, the Digital Markets Act that you just mentioned is rather revolutionary in terms of the obligations that it places upon the largest platforms, which might very well include a few European companies beyond the
3:39 am
American ones that will undoubtedly be included. So there will be some very far-reaching obligations on those companies, for example to enable interoperability. Facebook and Apple will be required to allow the use of different forms of in-app payment mechanisms. Google will have to rank its search results fairly, without any preference for its own results. So quite a number of really very serious obligations, in addition, of course, to the brand-new political agreement on the DSA, which has already been mentioned.

Johannes, these regulations, this is all very, very big in scope. How will it be ensured that tech companies are going to police themselves?

Yeah, look, I would like to add maybe one thing to what has been said, because the DSA is not only looking at illegal content, but also at harmful content. So think about,
3:40 am
for example, content that promotes or can increase things like eating disorders: young people following a feed may be pushed towards disordered eating. That is not illegal, but it is harmful to the young people who follow it. So the very large platforms will need to do a risk assessment every year, and the risks that are identified will need to be mitigated. This will then be controlled by the Commission, and of course there can be fines: fines can go up to 6 percent of global turnover, and in extreme cases there could even be a ban. So there is a basis to really go into the risk assessments and see what risks they have identified, how they are mitigated, and whether enough is being done or not.

Johannes, let me just follow up with you regarding something you said. You're talking about
3:41 am
harmful content. Who gets to decide what is considered harmful content? I realize that there are a lot of details that still need to be worked out going forward, but who's going to be the arbiter on this?

Well, one thing is of course illegal content, and that is easy to identify because, as Catalina said, it is defined by the law. When it comes to harmful content, that is really for the platforms to look into: what is their risk profile, and what can be harmful in terms of content? There are four categories that need to be assessed beyond illegal content, for example restrictions on freedoms, or harmful content that could affect minors, or disinformation. It depends on the platform: a social network has a different risk profile from, for example, an online marketplace, where it's more about counterfeit goods that may not be,
3:42 am
let's say, dangerous, but are still not compliant. So it really depends on the service: what is my risk profile as a platform, and how do I need to mitigate it?

Catalina, I saw you reacting to a lot of what Johannes was saying there, so I'm going to let you jump in. But I also want to ask you the same question: who gets to ultimately decide what is deemed harmful content, and does that bring up freedom-of-speech concerns as well?

Absolutely. The answer to your question is actually going to determine the success of the DSA, because I completely agree with Johannes that there are a lot of different types of content that the DSA covers. With illegal content, you can look at this from a very technical perspective: is there a provision at national or European level that says you can't have advertising aimed at minors under a specific age? And if that exists, for instance if we look at the Audiovisual Media Services Directive,
3:43 am
then immediately, if a type of content goes against this provision, it can be deemed illegal. Then we also have the systemic risks. If a platform, through its architecture, through the fact that it amplifies types of content, could have an impact on decision-making, on the well-being of users, or on a lot of other aspects such as democracy, touching on what you were mentioning about freedom of expression and also political advertising, then it becomes a matter of really trying to investigate. And this is where the procedural aspects of the DSA come very much in handy: to investigate what exactly the systemic risks are and how we can address them. The main problem, the key word here, is going to be data access. And I really hope that in the enforcement of the DSA, and also in its implementation, so first of all in the design of the infrastructures that will have to implement it, there will be sufficient attention paid to the fact that data shared by platforms is not
3:44 am
necessarily reliable. So we have a little bit of a catch-22 situation. We have seen this before, when Facebook tried to share research data: there were mistakes made, and a lot of research was deemed completely inaccurate because Facebook's data simply was not good.

Catalina, it sounds like what you're describing is rather complicated as far as setting up procedures to implement this. Do you foresee this going smoothly, or do you think this is going to get rather messy?

It very much depends on the coordination of all of the stakeholders that already have powers of investigation. For instance, look at the national consumer protection authorities, because consumer protection has been a very big concern for the DSA as well. Johannes was already mentioning that you can have various types of content; for instance, counterfeit goods are sometimes also considered to be
3:45 am
a product that platforms need to look at and remove, also from the consumer protection perspective, not just the intellectual property perspective. But the problem is that it very much depends on how all of these organizations that have a stake in the implementation of the DSA, and in the enforcement of prior regulation, will come together. There has been a lot written about the political compromise around the DSA, for instance on targeted advertising that is actually aimed at minors. So who is a minor? In which country? Which standard do we have? Who is responsible: is it the advertising authority, the data protection authority? A lot of national authorities need to come into this web of not only substantive rules but also enforcement, to figure out how exactly to divide or coordinate the space of the digital market, because otherwise there will be a cannibalization of enforcement.

Thomas, from your vantage point,
3:46 am
how do you think tech companies are going to deal with these new regulations? Do you expect that these new EU regulations are going to face a lot of legal challenges from Big Tech?

Well, one could suggest that perhaps the declarations of victory that have surrounded this political agreement might be a bit premature, in the sense that, it seems to me at least, a great deal remains to be seen about how the DSA is enforced, how harmful content, as you suggest, is defined, and how these new rules will be implemented by the tech companies. I do find that there is quite a lot of vagueness in the newly proposed law, as it's not yet fully adopted, and it's going to be difficult, I believe, for tech companies to know exactly what they have to do and what they may not
3:47 am
do. So I think there's a great deal of uncertainty about how this actually pans out. It will require a lot of good faith on the part of everybody involved, and we'll see whether that good faith materializes.

And Thomas, just to follow up: when you're talking about this great deal of uncertainty, are you talking specifically about the Digital Markets Act, the Digital Services Act, or both?

I'm talking about both, though I must say that my area is antitrust, so the Digital Markets Act is the one that I am much more deeply involved in. But it seems to me that the Digital Services Act, almost by design, incorporates a great deal more uncertainty. The Digital Markets Act will also have a great many questions surrounding it, about what it means and what the companies who are gatekeepers have to do and may not do.

Johannes, how much momentum did the effort to pass this legislation gain from the fact that there is
3:48 am
so much disinformation out there with regard to the war in Ukraine, and also so much disinformation when it comes to COVID-19?

Yeah, I think these developments have highlighted the need to do something, and they have influenced the negotiations a little bit, in a way that incorporated the crisis mechanism. Because one thing is that you have the regular exchanges under these rules, but another thing is that you have a situation like the one we are now in with Ukraine, to which you need to react, with platforms reporting, as under the regular rules, what they are seeing and what they are doing to mitigate the risks. That is why the crisis mechanism was incorporated: to allow these kinds of conversations and cooperation on these issues. And if I may add another point, because you mentioned freedom of expression: the DSA is precisely there to ensure freedom of expression, because one thing is
3:49 am
to remove content, and another thing is to over-remove it. This is why, for example, terms of use need to make it possible to understand why content has been removed, and users also need to have a means of redress, to very easily say: I think this content should be reinstated. What the DSA does here is bring democratic oversight: it does not leave these questions only to the platforms, but creates a governance framework which governs the way this works online. And if I can make a last point on what Thomas said about vagueness, or maybe on why the regulation is drafted the way it is: it is meant to be future-proof. You have nowadays a certain number of harms that did not exist, say, ten years ago or five years ago, and you do not know what the next ones will be five to ten years from now.
3:50 am
So you need to have laws that are future-proof. The e-Commerce Directive is 20 years old, which is a long time, so the new rules need to be able to cover what comes in the future as well.

Thomas, I saw you nodding along to some of what Johannes was saying there. It looked like you wanted to react; please go ahead.

Yeah, I agree completely that precisely defining, for example, what constitutes harmful content is really just not possible, because new kinds of harmful content will arise, and the law needs to be future-proof. So I don't disagree at all with that. But the fact that the notion of harmful content is so uncertain is going to require a great deal of good faith on everyone's side in order to make this work, it seems to me.

Catalina, the articles and the reporting
3:51 am
about what's in the DSA, the Digital Services Act, say that it's going to allow people to choose how content is presented to them. I want to ask you: is it actually going to allow people to stop algorithmic profiling that is set up to drive more engagement? Are they actually going to be able to opt out of that?

I think there has been a lot of public discourse around, for instance, targeted advertising; there is, for instance, an MEP who has been speaking about this at length. What we can already see is that a lot of platforms are already creating parallel recommender systems. For instance, Instagram and Twitter, even TikTok, have that right now. There are definitely different types of "For You" pages or different types of feeds, and some of them are based more on behavioral or data-related factors, and some of them are based on,
3:52 am
for instance, the people who you follow on a platform. So from that perspective, you could argue that some changes are already becoming very visible. Now, one of the major problems that I think the DSA can perhaps start to tackle, but which is so complex that we still need a lot of other solutions, is the fact that the data broker market is very opaque. Do we even have a map for really understanding which companies are using the type of data that is being collected by social media platforms? Which types of platforms and marketplaces are using it? What is the percentage of platforms that use it? All of these questions remain to be answered, because the pace at which the online and digital economy is developing is incredibly fast. And this is why I completely agree with what Johannes was mentioning,
3:53 am
that we need future-proof regulation. The question is: how do we draw the line between having too little content and context and having too much, as Thomas was mentioning? Because otherwise we sometimes end up with legal uncertainty even when future-proof regulation is there. The example that comes to mind is the Unfair Commercial Practices Directive, which has been, and still can be, very nicely applied to tackle dark patterns, another type of user manipulation that is very much quoted in the public discourse around the DSA. But if you look at the definitions in Article 5, for instance, of the UCPD, you see tests that are very, very vague, like an unfairness test, and you still need to rely on a judge to interpret them. And if you don't have clear interpretations for a very specific situation, then the market doesn't know what to do with them. We see this very clearly with dark patterns.

Catalina, I also want to ask you for a moment about another law that the EU enacted, the data privacy law called the
3:54 am
General Data Protection Regulation. There's been a lot of criticism that it has not been enforced vigorously enough. Are you concerned that the same thing might happen with the Digital Services Act?

I definitely see that we're moving into an era of digital enforcement, digital market monitoring, and also market surveillance, done not only by the data protection authorities, whether at European or national level, that are designated and have the mandate to implement and enforce the GDPR, but also by a lot of other agencies and authorities. So I think we're going to see more and more enforcement in the future. The question that I would like to ask, perhaps for this discussion but also for other debates, is: how do we ensure that we have coherent enforcement across all of these agencies that are stakeholders in the digital market? And that is the million-dollar question.

Johannes, there are tech
3:55 am
companies who say that these regulations could stifle innovation. What's your reaction to that?

If I could just quickly come back to the enforcement, because it's very different here. The GDPR is enforced not only at the national level but also regionally; in Germany, for example, it's the regions that are in charge. Here you have central enforcement, and that comes back to your first question: the DSA will be enforced, for the big platforms with more than 45 million users, by the Commission, and the same is true for the DMA and the gatekeepers. So that is a completely different situation. And I would also like to add one point on targeted advertising in the DSA: you have an option to indeed opt out and say, no, I do not want to be targeted, and I want a recommender system that is not based on profiling. Now, you can of course say that this leaves the choice with the consumer itself, but we do have bans on targeted advertising for minors and for sensitive data. Now, on your question about innovation: the point of the DSA
3:56 am
is to introduce obligations that grow with size, so size matters. If you're a small company, you will actually find it easier, because you do not have the same obligations as the big platforms. There is also one aspect that is very important, exactly on what you said about innovation: you have heavier obligations for the big players than for the small ones, and the DMA, for example, gives rights to smaller competitors. For example, those who use a marketplace to sell their products will have access to the data the gatekeeper holds. So it actually protects the smaller companies.

Thomas, we have a little less than a minute left. Let me just ask you very quickly: from your vantage point, does the EU have enough manpower to actually be able to monitor and enforce the Digital Services Act and the Digital Markets Act?

That was exactly the point I wanted to raise. Both of these pieces of legislation together mark
3:57 am
a huge increase in the amount of regulation in the tech sphere, and I tend to doubt that there will be sufficient resources deployed by the European Commission to be able to enforce them effectively.

All right, well, we have run out of time, so we're going to have to leave the conversation there. Thanks so much to all of our guests: Johannes Bahrke, Catalina Goanta, and Thomas Vinje. And thank you for watching. You can see the program again any time by visiting our website, aljazeera.com, and for further discussion go to our Facebook page, that's facebook.com/AJInsideStory. You can also join the conversation on Twitter; our handle is @AJInsideStory. For me, Mohammed Jamjoom, and the whole team here, bye for now.
3:58 am
As climate change heats up the planet, one scientist intends to take us back to the Ice Age to save the permafrost below. He's reintroducing animals to the grasslands above, starting with living creatures, and planning to resurrect an extinct species. Could this be a way to save our world? Witness: The Zimov Hypothesis, on Al Jazeera.
3:59 am
The climate has changed every year for millions of years; decades of talk, but little action. It's all about distraction, creating confusion, smoke and mirrors. The shocking truth about how the climate debate has been systematically subverted: the oil industry has been the main bankroller of opposition to climate action. "Do you think that's a bad thing, more CO2?" "I think that's a good thing, absolutely." The Campaign Against the Climate, on Al Jazeera.

We understand the differences and the similarities of cultures across the world, so no matter where you call home, Al Jazeera will bring you the news and current affairs that matter to you. Al Jazeera: revealing
4:00 am
eco-friendly solutions to combat threats to our planet, on Al Jazeera.

You're watching Al Jazeera with me, Sohail Rahman in Doha. A reminder of our top news stories: Emmanuel Macron has been re-elected French president for a second term. His far-right challenger, Marine Le Pen, conceded defeat shortly after polls closed and early estimates were released. Once official, the victory would make Macron the first French president in 20 years to secure a second term. Bernard Smith reports.