Law | Texas social media anti-censorship law goes into effect

What? I'm not taking anything from anyone. Or giving anyone censorship rights. I think your point only works in a world in which any social media site has a monopoly on Internet content.
I'm not sure this is real life or just a troll?

You said, the company has more rights than the individual.

I disagree.

Propagate your bullshit elsewhere
 
The thing that is funny about this is that it would be a trivial task to create a conservative Twitter, YouTube, or Facebook. There might even be a market for it. The reason this isn't looked at as a solution is that nutty Republicans don't want to live in a world where libs don't have to see their posts.

I think it’s more that the demographic concerned about this is over 50 and likely doesn’t know how to switch over to something new, and with that aging demographic it isn’t exactly an appealing business model to start a new platform just to get them to cross over.
 
Why should corps be considered public forums just because they are popular? They aren’t allowed to have a business model anymore? How about we create a state-run public forum with no censorship and leave successful private businesses to do business?

There isn't any problem with corporations having a business model. There is a problem with individuals using that corporation to exert influence on public perception or to attempt to stifle free speech and open dialogue in an era when such things are largely contained on social media, not in open public areas via "soap box" debates.

If you wish to create extremists, stifle the voices of people who already feel marginalized, ignored, or outright shunned. Shut down public debate of their viewpoint. Make them feel unwelcome in society. They'll find fringe sites that are echo chambers reinforcing that they are right in their beliefs and justified in their actions.

There's no easy fix to the problem of biased media and control of such media in the hands of a few.

Imagine today's social media stifling knowledge of something like the Tuskegee syphilis study, or of tobacco companies knowing that their product causes cancer, because a political party, corporations, or people of a certain ideology used censorship to control information and shape narratives.
 
There isn't any problem with corporations having a business model. There is a problem with individuals using that corporation to exert influence on public perception or to attempt to stifle free speech and open dialogue in an era when such things are largely contained on social media, not in open public areas via "soap box" debates.

If you wish to create extremists, stifle the voices of people who already feel marginalized, ignored, or outright shunned. Shut down public debate of their viewpoint. Make them feel unwelcome in society. They'll find fringe sites that are echo chambers reinforcing that they are right in their beliefs and justified in their actions.

There's no easy fix to the problem of biased media and control of such media in the hands of a few.

Imagine today's social media stifling knowledge of something like the Tuskegee syphilis study, or of tobacco companies knowing that their product causes cancer, because a political party, corporations, or people of a certain ideology used censorship to control information and shape narratives.
What I see are mostly people claiming bias for being fact-checked when they’re straight up lying, or for being banned for violating terms of service. Why should any business cater to people who are only interested in spouting non-stop bullshit and turning their site into 4chan?
I’m sorry for the few people with a legitimate gripe that end up getting caught in that net, but they should be pissed at the people creating the problem.

The internet is big enough for everyone. That said, if someone wants to pursue a monopoly charge, that’s fair game. If we want to make regulations that provide users on very large sites with a clear path to dispute a decision via arbitration or whatever, I’d probably support something like that. If we want to create a public utility for discourse, I’d support that.
 
I'm not sure this is real life or just a troll?

You said, the company has more rights than the individual.

I disagree.

Propagate your bullshit elsewhere

Wait, where did I say the company has more rights than the individual? You just made that up, no? And you're accusing me of trolling? Try to address points without lying.
 
What I see are mostly people claiming bias for being fact-checked when they’re straight up lying, or for being banned for violating terms of service. Why should any business cater to people who are only interested in spouting non-stop bullshit and turning their site into 4chan?
I’m sorry for the few people with a legitimate gripe that end up getting caught in that net, but they should be pissed at the people creating the problem.

The internet is big enough for everyone. That said, if someone wants to pursue a monopoly charge, that’s fair game. If we want to make regulations that provide users on very large sites with a clear path to dispute a decision via arbitration or whatever, I’d probably support something like that. If we want to create a public utility for discourse, I’d support that.

I think it’s happened to some of the country, but there really needs to be a reset in looking at how much rope Trump was given during that timeframe. For it to be controversial to ban a departing president who was actively trying to delay the transfer of power in various ways (the key one being taunting his VP not to accept state votes) is crazy. Like, what option did they really have there? That wasn’t a good time to think “well, let’s let the battle of ideas sort this out.” It was just blatantly un-American. His ban completely made sense and actually shows just how much leeway these sites give highly public officials to say what they want.
 
What I see are mostly people claiming bias for being fact-checked when they’re straight up lying, or for being banned for violating terms of service. Why should any business cater to people who are only interested in spouting non-stop bullshit and turning their site into 4chan?
I’m sorry for the few people with a legitimate gripe that end up getting caught in that net, but they should be pissed at the people creating the problem.

The internet is big enough for everyone. That said, if someone wants to pursue a monopoly charge, that’s fair game. If we want to make regulations that provide users on very large sites with a clear path to dispute a decision via arbitration or whatever, I’d probably support something like that. If we want to create a public utility for discourse, I’d support that.

I can't say that I've noticed "straight up lying" being the majority of the problem for those who are banned, shadow-banned, or de-monetized, but I can say that violating terms of service is what I most often see regarding bans or removal of content. The problem with that is: who gets to say what content is in violation of the terms of service?

For example, I can make a video stating the belief that a man can't simply become a woman. Someone can view this as incendiary and demeaning or as hate speech towards trans people and flag my content to have it removed.

Because of the ability to censor opposing views and delete content with impunity, I think your second paragraph has ideas worth talking about.
 
Wait, where did I say the company has more rights than the individual? You just made that up, no? And you're accusing me of trolling? Try to address points without lying.
Your first post:

""Free speech" in this case means the exact opposite of free speech. It's forcing people to host comments they don't want to host."

So the companies that host supposedly free public speech can use the framework of the first amendment to censor the individual and look like they are advocating the freedom to speak one's mind. That's dumb.
 
Your first post:

""Free speech" in this case means the exact opposite of free speech. It's forcing people to host comments they don't want to host."

You: "You said, the company has more rights than the individual."
Me: "Wait, where did I say the company has more rights than the individual?"
You: ""Free speech" in this case means the exact opposite of free speech. It's forcing people to host comments they don't want to host."

??? I'll ask again: Where did I say the company has more rights than the individual? Don't quote me saying something that isn't "the company has more rights than the individual." Quote me saying what you said I did, or at least have the decency to admit you were lying because you didn't have a good argument for your position.

So the companies that host supposedly free public speech can use the framework of the first amendment to censor the individual and look like they are advocating the freedom to speak one's mind. That's dumb.

No, I think everyone should be protected by the First Amendment--including people who you (and I) disagree with.
 
Forces private media companies to dedicate the platforms they built to hosting content they do not wish to host.

This is like arguing Fox News now needs to open up its TV segments to any political content people wish to air because they are popular.

Not really. Fox News can be held accountable for what is on their channel. These social media companies have always argued, and continue to argue, that they can not be held responsible for what is on their platform. If they will accept responsibility for content then they can also edit and choose what is on it. If they continue to be given the special legal protection of immunity for what is on their platform then they must publish everything. They can choose either path.
 
"Free speech" in this case means the exact opposite of free speech. It's forcing people to host comments they don't want to host.


No, it is forcing people who have special legal immunity from prosecution based on content on their platform to host comments they don't like. If social media platforms waive the special legal protection which was created only for them, then they can edit and curate all they want.
 
I can't say that I've noticed "straight up lying" being the majority of the problem for those who are banned, shadow-banned, or de-monetized
I'm sure you're aware there's a very large and real effort to spread disinformation. Reasonable people can disagree on how to address that problem, but simply fact checking or rebutting obvious falsehoods has sent people into a tizzy over unfair censorship. See the old Donald Trump twitter account for many, many examples.

but I can say that violating terms of service is what I most often see regarding bans or removal of content. The problem with that is: who gets to say what content is in violation of the terms of service?

For example, I can make a video stating the belief that a man can't simply become a woman. Someone can view this as incendiary and demeaning or as hate speech towards trans people and flag my content to have it removed.
Has this happened on one of the big platforms? I'd disagree with a ban for that unless it was part of an obvious attempt to antagonize or demean or offend. In which case, I'm fine with the business's discretion. I see no reason why the state should get to decide for them if that kind of thing is allowed on their site. No more than I'd want them to tell a baker to bake a cake he finds offensive. As a rule, government should stay as minimally involved with policing free speech as possible. I think we also have to accept that mistakes will always be made. Mistakes in choosing moderators who turn out to be biased, misinterpreting intent, system design, algorithms, etc.

Because of the ability to censor opposing views and delete content with impunity, I think your second paragraph has ideas worth talking about.
As a way to detect when larger patterns emerge and to enforce systemic controls (left to the responsibility of the business, like detecting patterns of suicidal/homicidal content and squashing or reporting them, for example), or to arbitrate fair grievances between parties, or to prevent monopolies, I'm open to some greater level of government oversight. But a state with obvious partisan lean deciding what is or isn't "fair" politically and fining or suing companies? That's a big no from me, dawg. Imagine the Pandora's box of state laws if that starts happening.
 
So like news was before Reagan did away with the FCC fairness doctrine which forced media companies to give both sides of the story airtime and air opposing views fairly?

Sounds good to me

The “both sides” approach isn’t going to work in times like these. There are “sides” expecting companies to dedicate a portion of their platform to claims that a mass shooting at a kids’ school is actually crisis actors carrying out the orders of a satanic-pedo cult/world-controlling evil cabal.

I recall, during the counting of votes, conservatives rifling through at least a dozen different conspiracies in a matter of days, like they were going through a dumpster for food. First it was ballot-stuffing videos, till it was pointed out that was an old video from Russia; then a fake ballot-burning video started going around; then armies of dead voters; then a pivot to voting machines being hacked, no, now they’re being kidnapped; on and on.

No one could keep up with all the “sides” that feel entitled to attention and a platform nowadays.
 
And just like that, damn near every person responsible for enforcing war room rules is explaining how to exploit the rules related to immigration.
 
No, it is forcing people who have special legal immunity from prosecution based on content on their platform to host comments they don't like. If social media platforms waive the special legal protection which was created only for them, then they can edit and curate all they want.

No one has special legal immunity from prosecution, though. Lots of misinformation out there, but what you're thinking of is something very different. One interpretation of moderation of comments sections (and note that this applies to everyone--it's not like some companies are treated differently) is that everything you don't take out, you basically put in. That interpretation is ridiculous and would prevent any moderation on any comments section from any site (or more practically would prevent all but the biggest companies from even having comment sections), but just because it's ridiculous doesn't mean some lawyer somewhere wouldn't try to profit off making it and filing a frivolous lawsuit. So there's a rule in place shutting that down. In no way does that mean that everyone who has a website then gives up their First Amendment rights. Furthermore, it's insane to present the argument that people shouldn't have freedom of speech if they're protected from frivolous lawsuits as a *pro*-First Amendment position. The argument would go better if you guys would just openly admit that you think that only people who agree with you should have First Amendment protections.
 
Not really. Fox News can be held accountable for what is on their channel. These social media companies have always argued, and continue to argue, that they can not be held responsible for what is on their platform. If they will accept responsibility for content then they can also edit and choose what is on it. If they continue to be given the special legal protection of immunity for what is on their platform then they must publish everything. They can choose either path.

Fox is accountable for what their employees say while working for them, just like Twitter. They're not liable for what randos say in the comments section on their site, just like Twitter.
 
You: "You said, the company has more rights than the individual."
Me: "Wait, where did I say the company has more rights than the individual?"
You: ""Free speech" in this case means the exact opposite of free speech. It's forcing people to host comments they don't want to host."

??? I'll ask again: Where did I say the company has more rights than the individual? Don't quote me saying something that isn't "the company has more rights than the individual." Quote me saying what you said I did, or at least have the decency to admit you were lying because you didn't have a good argument for your position.



No, I think everyone should be protected by the First Amendment--including people who you (and I) disagree with.
What did your first post mean?
 
No one has special legal immunity from prosecution, though. Lots of misinformation out there, but what you're thinking of is something very different. One interpretation of moderation of comments sections (and note that this applies to everyone--it's not like some companies are treated differently) is that everything you don't take out, you basically put in. That interpretation is ridiculous and would prevent any moderation on any comments section from any site (or more practically would prevent all but the biggest companies from even having comment sections), but just because it's ridiculous doesn't mean some lawyer somewhere wouldn't try to profit off making it and filing a frivolous lawsuit. So there's a rule in place shutting that down. In no way does that mean that everyone who has a website then gives up their First Amendment rights. Furthermore, it's insane to present the argument that people shouldn't have freedom of speech if they're protected from frivolous lawsuits as a *pro*-First Amendment position. The argument would go better if you guys would just openly admit that you think that only people who agree with you should have First Amendment protections.


I'm wildly pro free speech. That said, Section 230 (I assume we are both talking about that) has been exploited by Facebook and others to protect themselves. S230 was meant to protect them by saying that they can moderate but aren't responsible for what is on their platform. The issue is that FB and everybody else have done a piss-poor job holding up their end of the bargain - namely, keeping harmful stuff off their platform - because it takes money and effort to do so and will alienate potential users. Instead, the platforms are hiding behind the shield they were given but not actually doing anything meaningful to moderate. Heck, Facebook struggles to keep livestreamed mass murders off its platform.
 
Two pieces there:
1. Doing that just gets the same result. A company which opts for its own terms of service won’t be able to moderate content effectively enough to avoid liability for what some person posts, so the only companies remaining will be the “constitutional” ones you are asking for (and may I remind people, there already are companies that claim to do this, like Gab; no one uses them, however, because of the user base that goes there and says things that might be constitutional but aren’t the most fun to have to view if you are just average Joe trying to browse content).
2. That other Bass Pro Shop thing is actually what’s usually propped up by the left with social media, which I’m also skeptical of, and I wouldn’t trust them to modify Section 230. You get the narrative that these companies wield too much power over, say, our elections or misinformation in general, and that they need to be regulated. I don’t think it would work out well, and it’s an attempt to have government police what they want online more. I don’t like that much better. The only thing I think I could agree to with that analogy is basic precautions, like trying to protect kids who use the internet from seeing adult content, but even that has to leave most of the responsibility with parents, who hopefully are more mindful of that with each generation.

Yeah, we can go back and forth on the details and whatnot; that's not my primary concern. My main concern is how much power and influence they have over our political system and our society in general. It was just reported that Venmo is colluding with the Canadian government to freeze accounts. It's been reported in the past that social media companies have colluded with government agents to suppress news stories. There have been campaigns to ban people based on lies to silence them. How do we fix it? Fuck if I know, but something needs to be done, and having a conversation about it is at least a good starting point.

And to the highlighted part, Twitter is actively allowing trans activists to target kids to sell hormones and puberty blockers without a doctor or parental consent, and banning people who call it out. So y’all can keep acting like “oh, righties hate freedom” when it really has nothing to do with that.
 
What did your first post mean?

The comment was this: ""Free speech" in this case means the exact opposite of free speech. It's forcing people to host comments they don't want to host."

I was talking about people who think that "free speech" means the gov't forcing you to host comments you don't want to host, and I said it in a way that implies I disagree.

I'm wildly pro free speech. That said, Section 230 (I assume we are both talking about that) has been exploited by Facebook and others to protect themselves. S230 was meant to protect them by saying that they can moderate but aren't responsible for what is on their platform. The issue is that FB and everybody else have done a piss-poor job holding up their end of the bargain - namely, keeping harmful stuff off their platform - because it takes money and effort to do so and will alienate potential users. Instead, the platforms are hiding behind the shield they were given but not actually doing anything meaningful to moderate. Heck, Facebook struggles to keep livestreamed mass murders off its platform.

It was meant to make clear that moderating a little doesn't mean that they are liable for anything that slips past. I don't see it as a bargain because I believe in freedom of speech as a principle (i.e., I don't see it as a gift that the gov't grants you as long as you behave). I agree that they can do a better job of moderating, but I think the appropriate response to that is to complain about it to them or publicly, or to not use their product. Calling on the gov't to force them to change the site is an attack on freedom of speech.
 