FSNN | Free Speech News Network
AI & Censorship

The Internet Still Works: SmugMug Powers Online Photography

By News Room · 2 hours ago · 7 Mins Read
SmugMug is a family-owned photo hosting and e-commerce platform that helps professional photographers run their businesses online. Founded in 2002, the company provides tools for photographers to show their work, deliver client galleries, sell prints, and manage payments. 

In 2018, SmugMug purchased Flickr, the long-running photo-sharing community, which added tens of millions of active hobbyist photographers to the company’s user base. 

Ben MacAskill is President and COO of SmugMug’s parent company, Awesome, which he co-founded with his family. Awesome also includes the media network This Week in Photo and the nonprofit Flickr Foundation, which focuses on preserving publicly available photography. MacAskill has been an active voice in policy discussions around Section 230 and online platform regulation. He was interviewed by Joe Mullin, a policy analyst on EFF’s Activism Team.

Joe Mullin: How would you explain Section 230 to a SmugMug photographer who hasn’t heard of it but relies on you to share their work and run their business?

Ben MacAskill: Section 230 allows us to run our business. We are a small, family-run business. We don’t have the resources to police every single upload, every single comment, or every single engagement that happens on the site.

That includes photographers who have comments on their sites. Anywhere there’s interaction online, Section 230 protects us. 

It doesn’t absolve us of liability. We can’t run rampant and do anything we want. It just protects us and makes moderation scalable so that we can run our business.

What would you have to change if Section 230 were eliminated or significantly narrowed? 

Honestly, there’s a high chance that it would bankrupt platforms like ours. They’re not wildly profitable. If Section 230 is done away with, we have to [check] content that goes online to make sure we’re not liable. That means policing tens of millions of uploads per day. 

That would kill the business of a lot of photographers. Can you imagine—you just got married, and you’re waiting for your wedding photos for a week or two because they’re in some moderation queue? 

If we don’t have legal protections, and we get one nefarious customer—if something goes sideways—then I’m liable for that. 

I don’t, and can’t possibly, know whether every single photo is appropriate or legal as it’s uploaded. We would literally have to moderate everything before it goes online. I don’t think any business can afford that, period. I guess you could have an offshore call-center type operation. Still, it would change the entire nature of the real-time internet. Imagine posting something to Instagram and having the platform say, “Cool, we’ll get back to you in 8 to 12 days.”

What kind of content moderation do you do on SmugMug? 

If a user uploads something illegal, we will report them as soon as we find it. We’re not protecting them. We don’t condone or allow illegal behavior. We work very closely with organizations, nonprofits, and governmental agencies to detect CSAM—child exploitative material—and we report it to the National Center for Missing and Exploited Children. We report users and eliminate illegal content on our platforms—which is one reason we have such a low prevalence of that problem.

But that does take effort and time to find, and there is currently no perfect solution. The tech solutions that exist can’t detect it at 100% accuracy, or anywhere close. And with tens of millions of uploads a day, going through them one by one is impossible. 
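Detection of known illegal material typically works by matching uploads against hash databases maintained by partner organizations like the ones MacAskill mentions. Below is a minimal, hypothetical sketch using exact cryptographic hashing; the hash set and function names are illustrative, not any real system. Production systems use perceptual hashing to catch altered copies, and as he notes, even those fall short of 100% accuracy.

```python
import hashlib

# Hypothetical hash list of known flagged material, supplied by a
# partner organization (purely illustrative -- not a real dataset).
KNOWN_BAD_HASHES = {hashlib.sha256(b"example flagged content").hexdigest()}

def check_upload(data: bytes) -> bool:
    """Return True if an upload byte-for-byte matches known material.

    Exact hashing misses re-encoded, resized, or cropped copies;
    perceptual hashes handle those cases better, but none of these
    techniques reach 100% accuracy.
    """
    return hashlib.sha256(data).hexdigest() in KNOWN_BAD_HASHES
```

This is why matching scales to tens of millions of uploads a day while manual one-by-one review cannot: a hash lookup is cheap, but it only catches what is already in the database.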

How do you think more generally about protecting user speech and creative expression? 

On SmugMug, we’re really focusing on professionals running their business. So we don’t have to [weigh in] on content too much. 

On Flickr, we are big proponents of expression and artistic creativity. Photographers have opinions! But we do draw the line at things like hate speech and harassment. We aggressively maintain a friendly platform. Our community guidelines are very specific: you cannot harass other customers, and you cannot upload anything classified as hate speech, or threats, or anything along those lines.

Those rules are generally policed by the community. We do have some text analysis tools, but when community members feel harassed or threatened, reports will come in. We’ll address them on a one-by-one basis and remove harassing material from our platform. 
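The report-driven workflow described above, where reports come in from the community and each is reviewed individually, can be sketched roughly as a queue. The names and removal rules here are illustrative assumptions, not SmugMug’s or Flickr’s actual system:

```python
from collections import deque
from dataclasses import dataclass
from typing import Optional

@dataclass
class Report:
    photo_id: str
    reporter: str
    reason: str  # e.g. "harassment", "hate speech", "threat"

class ModerationQueue:
    """Handle community reports one by one after publication --
    the opposite of pre-screening every upload before it goes live."""

    VIOLATIONS = {"harassment", "hate speech", "threat"}

    def __init__(self) -> None:
        self._pending: deque = deque()
        self.removed: set = set()

    def file_report(self, report: Report) -> None:
        self._pending.append(report)

    def review_next(self) -> Optional[Report]:
        """Review the oldest report; content that breaks the
        community guidelines is removed from the platform."""
        if not self._pending:
            return None
        report = self._pending.popleft()
        if report.reason in self.VIOLATIONS:
            self.removed.add(report.photo_id)
        return report
```

The key design point is that content is live immediately and only the small fraction that gets reported consumes moderator time, which is what makes a small team viable at this scale.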

Our ability to moderate is one of the things that makes Flickr what it is. If we lose the ability to enforce our own moderation rules—or have that legislated for us—then it changes the entire nature of the community. And not in a good way. Losing the ability to moderate would permanently and forever change what we’ve built.

What kind of complaints or takedown requests do you receive, and how do you handle them, both in the U.S. and abroad?

Flickr is often referred to as the friendliest community online. You know, we’re not dealing with a lot of hate. We’re not dealing with a lot of threats. Under other frameworks, like the DMCA, we do takedowns on copyrighted material. 

We’re able to handle it with a fully internal team, and we have a great track record. But the user base and the content base is so large that, if we had to assume that those tens of millions of uploads a day are problematic, the burden would be extreme. 

We have a robust Trust and Safety Team, and we operate in every non-embargoed country on Earth. So we are subject to a lot of different laws and regulations: “likeness” rules and privacy rules in certain countries that don’t exist here in the United States. Even state to state, laws vary. It’s a complicated framework, but we pay attention to it.

Globally, the response works in much the same way Section 230 does. That is, we operate on reports and discovery, not on pre-screening everything.

What do you think that policy makers most often misunderstand about how platforms like yours operate?

One misconception is that we are not beholden to any laws. That Section 230 absolves us of any responsibility and any liability, and we can just do whatever we want. They talk about it as “reining in tech companies,” or “holding tech companies accountable.” But I am accountable for the content on my platform. We’re not given this “get out of jail free” card. 

And I think they assume all platforms don’t really care about this, that anything that is done is done begrudgingly. But we’re very proactive about keeping a clean, polite, and friendly community. We are already very aggressively policing our platform. 

And even legal content gets moderated, because it might just not be appropriate for a particular community. 

We enforce our rules much the way other private, in-person businesses enforce theirs. If you start screaming hateful things at patrons in a coffee shop, they’re going to throw you out. They want a quiet, chill vibe where people can sip their lattes. We’re doing the same sort of things.

As an independent, family-owned company, you’re in an ecosystem dominated by much larger platforms. How are these issues different for you as a smaller service?

I think it’s a much more existential threat for middle and small tech companies. It also shuts off the next generation of these platforms. The computer science student in a dorm room right now won’t have the legal protections to launch, to even try to build something new. At least not here in the United States. 

