I bet it was something like the hardware ID instead, but she misspoke
I know that’s how it works in the US, but the lawsuit is in Japan, which you always hear has stricter copyright laws. Not really sure how this one will play out, though.
I wonder if part of the reason for supporting this is that they like the secondary effect that all this information is now also available to governments
The profit they get from the sale of the television should be enough that they don’t have to make the television shit for slightly more profit. Why do people even buy these?
But televisions cost hundreds of dollars at least
“each new connected TV platform user generates around $5 per quarter in data and advertising revenue.”
Sounds like a pathetic amount of money for betraying your customers with a shitty, ad-infested smart TV
it may be moral in some extreme examples
Are they extreme? Is bad censorship genuinely rare?
but there are means of doing that completely removed from the scope of microblogging on a corporate behemoth’s web platform. For example, there is an international organization whose sole purpose is pursuing human rights violations.
I think it’s relevant that tech platforms, and software more generally, have a sort of reach and influence that international organizations do not, especially when it comes to the flow of information. What limit are you suggesting here on what may be done to oppose harmful censorship? That it be legitimized by some official consensus? That a “right to censor” exist and be enforced, but be subject to some form of formalized regulation? That would exempt any tyranny practiced by the most influential states.
I’m going to challenge your assertion that you’re not talking about
You can interpret my words how you want and I can’t stop you willfully misinterpreting me, but I am telling you explicitly about what I am saying and what I am not saying because I have something specific I want to communicate. When you argue that
I believe each country should get to have a say in what is permissible, and content deemed unacceptable should be blockable by region
In the given context, you are asserting that states have an apparently unconditional moral right to censor, and that this right means third parties have a duty to go along with it and not interfere. I think this is wrong as a general principle, independent of the specific example of Twitter vs Brazil. If the censorship is wrong, then it is ok to fight it.
Now you can argue that some censorship may be harmful because of its impact on society, such as the removal of books from school hampering fair and complete education or banning research texts that expose inconvenient truths.
Ok, but the question is, what can be done about it? Say a country is doing that. A web service defies that government by providing downloads of those books to its citizens. Are they morally bound to not do that? Should international regulations prevent what they are doing? I think no, it is ok and good to do, if the censorship is harmful.
Since my argument isn’t about what should be censored, I’m intentionally leaving the boundaries of “harmful censorship” open to interpretation, save the assertion that it exists and is widely practiced.
I also think that any service (twitter) refusing to abide by the laws of a country (Brazil) has no place in that country.
That could be true in a literal sense (the country successfully bans the use of the service), or not (the country isn’t willing or able to prevent its use). Morally though, I’d say you have a place wherever people need your help, whether or not their government wants them to be helped.
If a government is imposing harmful censorship, I think supporting resistance to that censorship is the right thing to do. A company that isn’t located in that country ethically shouldn’t be complying with such orders. Make them burn political capital taking extreme and implausible measures.
tbf, the article only assumes he told them no because of how implausible the task seems; the actual details of what, if anything, was discussed and what happened are unknown.
Implying that it was worse and has gotten better, or will get better to the point where data hoarding is unnecessary. I guess it would be nice if things turned out that well.
Privacy means personal agency: freedom from people, whether individuals, companies, or governments, controlling you with direct or implied threats, or with subtler manipulation. They can do that because they have your dox, and because information is power.
A lack of privacy adds fuel to the polycrisis because if we can’t act in relative secrecy that basically means we can’t act freely at all, and nothing can challenge whoever runs the panopticon.
It’s important though because if that’s the real reason Google pays them, they could come up with some other excuse to give them the money.
This one can do that stuff: https://github.com/huchenlei/ComfyUI-layerdiffuse?tab=readme-ov-file
I did all my transportation and shopping with a mountain bike for a year, and it’s kind of difficult on snow and ice; I fell over a few times. The trick is to never turn at all while on that stuff, but it’s still hard. The cold also makes the lubricant in the mechanisms work worse, so you need special oil. My hands got very cold holding the handlebars; you need to find a balance between gloves that hold warmth and resist the wind, and gloves that leave you enough dexterity for the brakes and shifters.
So much awful stuff that both sides seem to be able to agree on
I don’t think that person is bragging, just saying why it’s useful to them
Anti-money laundering provisions in the EU have been adjusted several times though
Adjusted to give more leeway? Can you cite a source on this happening?
It’s not actually clear that it would only affect huge companies. Much of open source AI today is done by building on models that large companies have released for free, and the concern was that the requirements in the bill would deter them from continuing to do so. The “kill switch” requirement in particular made it seem like the people behind the bill were either oblivious to this state of affairs or intentionally trying to force companies to stop releasing model weights and only offer centralized services like what OpenAI is doing.