
500px Apps Back in the App Store

After the 500px app was pulled from the App Store last week, I was glad to see it this morning in my updates list.

500px app updating in the iOS App Store

Given the few comments from 500px, they have clearly been heads-down getting the issues resolved to Apple’s satisfaction. According to GigaOm’s Erica Ogg, Apple required three things of 500px before its apps could return to the App Store:

500px has been updated with three fixes requested by Apple’s app reviewers, including a tweak that will prevent queries for explicit image searches from producing results, adding a function for users to report inappropriate content, and the addition of a 17+ age rating on the app.

After my previous post, an Apple spokesperson made a statement that was reported by TechCrunch and others:

The app was removed from the App Store for featuring pornographic images and material, a clear violation of our guidelines.

I have used 500px for some time, and although I have seen artistic nudity, I have never seen anything approaching pornography. I wouldn’t want my images associated with any service that knowingly hosted pornography, and would likely drop my membership.

I wonder how long Apple had been talking to 500px prior to pulling the app. On the surface, it appears that double standards may apply, as reported by the SMH:

Apple allows apps from such big-name social platforms as Twitter, Vine, Tumblr and Pinterest to remain in its App Store even though they contain adult content. Yet it has knocked off other lesser-known sites from its store because of “pornographic images and material”.

I don’t believe any company should enforce censorship standards that go over and above those of the Western world.

500px App Censored from App Store

Des Paroz Gallery on 500px
I primarily use two online image hosting services to host my Photo Gallery – 500px and Flickr. I like both services and find that they appeal to slightly different groups of users.

I had actually gone off Flickr for a while, but decided to give it another try (and recommended others do so too) when they did two things: introduced a great update to their iPhone app and offered a three-month trial/extension of Flickr Pro.

What drew me to 500px was their fantastic universal iOS app, along with hosting of hi-res images and the ability to have galleries at your own domain – mine are at photos.desparoz.com.

I love the fact that people can view my images displayed gorgeously on the web, or on an iPhone, iPad or Android device.

So I was disturbed today to read that Apple has withdrawn the 500px app from the App Store, over potential access to nudity. According to PetaPixel:

If you were planning to install 500px’s popular photo sharing app on your iPhone or iPad today, you’re out of luck. The app was abruptly yanked from the iTunes App Store earlier today over the fact that users can search for photos showing artistic nudity.

This is especially concerning when, as reported by Cult of Mac, it is actually more difficult to access nudity on the 500px app than it is on other popular apps, including Flickr.

What’s interesting about this to me is that 500px’s method of keeping minors from seeing nude images in their official iOS app is a lot more prohibitive than that employed by Flickr, a similar photo-sharing app.

In the 500px app, safe browsing is the default, and you have to change this through the website, not the app. In Flickr, the app allows you to disable the safe browsing lock.

I have two fundamental concerns over this development.

Firstly, why has Apple suddenly taken this action, considering that the 500px app has been in the App Store since October 2011? It seems to be a unilateral action, especially considering 500px had committed to making and submitting immediate changes.

It also sets a dangerous precedent, and we have to wonder whether apps like those by Flickr and Tumblr will also be yanked.

Secondly, as a committed user of Apple technology, I rely on Apple products for a great user experience – both from the hardware and the software it provides. In relation to the App Store, Apple promises me that it will approve apps that meet basic guidelines on security. I’m OK with iOS sandboxing, as a rule, because my phone needs to work.

But I do not need Apple to act as a censor, making a unilateral judgement about what content I can and cannot see. Certainly, explicit pornography (such as material that might be rated R 18+ or X 18+ under the Australian classification system) can justifiably be restricted, but the human body is a wonder of nature, and can be very artistic.

Given that “nudity of moderate impact” can be included in material rated M (Mature) in Australia, I think that Apple should allow any app to carry material up to at least this level (or maybe MA 15+), as long as appropriate safe-browsing modes are enabled.

Perhaps what Apple should do is bake safe-mode tools into iOS, and allow individual apps to access those settings. If a safe mode is turned on for the device, individual apps should respect it. If it is turned off, material up to an M or perhaps MA 15+ classification should be allowable.
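To make that idea concrete, here is a minimal sketch in Swift of how such a scheme might work. Everything in it is hypothetical – none of these types or settings exist in iOS today, and the rating names are just loosely modelled on the Australian classifications mentioned above.

```swift
import Foundation

// Hypothetical content classifications, loosely based on the Australian
// ratings discussed above (G through MA 15+). Purely illustrative.
enum ContentRating: Int, Comparable {
    case general = 0        // G
    case parentalGuidance   // PG
    case mature             // M  ("nudity of moderate impact")
    case mature15Plus       // MA 15+

    static func < (lhs: ContentRating, rhs: ContentRating) -> Bool {
        lhs.rawValue < rhs.rawValue
    }
}

// A stand-in for the device-level safe mode I'm imagining. In a real system
// this would be a setting the user (or a parent) controls in iOS Settings,
// exposed to apps through a system API.
struct DeviceSafeMode {
    /// The highest classification the device owner has chosen to allow.
    var maximumAllowedRating: ContentRating = .general
}

// How an individual photo-sharing app could respect the device setting.
struct Photo {
    let title: String
    let rating: ContentRating
}

func visiblePhotos(_ photos: [Photo], under safeMode: DeviceSafeMode) -> [Photo] {
    photos.filter { $0.rating <= safeMode.maximumAllowedRating }
}

// With safe mode left at the default (G), an artistic nude stays hidden;
// if the user raises the ceiling to M, it becomes visible.
let library = [
    Photo(title: "Harbour at dawn", rating: .general),
    Photo(title: "Figure study", rating: .mature),
]

var safeMode = DeviceSafeMode()
print(visiblePhotos(library, under: safeMode).map(\.title))   // ["Harbour at dawn"]

safeMode.maximumAllowedRating = .mature
print(visiblePhotos(library, under: safeMode).map(\.title))   // ["Harbour at dawn", "Figure study"]
```

The point is simply that the policy decision – what ceiling applies – would sit with the device owner in one place, while apps like 500px or Flickr would only have to honour it.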

Apple needs to provide a balance in iOS. Certainly, most users don’t want or need “Wild West” access to every aspect of a system that is part of an important communications device, but at the same time, Apple should not act as a censor. Censorship is a tricky subject, and often leads down a slippery slope. It is the role of democratic governments to make considered decisions on behalf of the people, not of companies like Apple to make decisions unilaterally.