I primarily use two online image hosting services to host my Photo Gallery – 500px and Flickr. I like both services and find that they appeal to slightly different groups of users.
I had actually gone off Flickr for a while, but decided to give it another try (and recommended others do so too) when they did two things: introduced a great update to their iPhone app and offered a three-month trial/extension of Flickr Pro.
What drew me to 500px was their fantastic universal iOS app, along with hosting of hi-res images and the ability to have galleries at your own domain – mine are at photos.desparoz.com.
I love the fact that people can view my images displayed gorgeously on the web, or on an iPhone, iPad or Android device.
So I was disturbed today to read that Apple has withdrawn the 500px app from the App Store over the potential for users to access nudity. According to PetaPixel:
If you were planning to install 500px’s popular photo sharing app on your iPhone or iPad today, you’re out of luck. The app was abruptly yanked from the iTunes App Store earlier today over the fact that users can search for photos showing artistic nudity.
This is especially concerning when, as reported by Cult of Mac, it is actually more difficult to access nudity on the 500px app than it is on other popular apps, including Flickr.
What’s interesting about this to me is that 500px’s method of keeping minors from seeing nude images in their official iOS app is a lot more prohibitive than that employed by Flickr, a similar photo-sharing app.
In the 500px app, safe browsing is the default, and you have to change this through the website, not the app. In Flickr, the app allows you to disable the safe browsing lock.
I have two fundamental concerns over this development.
Firstly, why has Apple suddenly taken this action, considering that 500px has been in the App Store since October 2011? It seems to be a unilateral action, especially considering that 500px had committed to making and submitting immediate changes.
It also sets a dangerous precedent, and we have to wonder whether apps like those by Flickr and Tumblr will also be yanked.
Secondly, as a committed user of Apple technology, I rely on Apple products for a great user experience – both from the hardware and the software it provides. In relation to the App Store, Apple gives me the promise that it will approve apps that meet basic guidelines on security. I’m ok with iOS sandboxing, as a rule, because my phone needs to work.
But I do not need Apple to act as a censor, making a unilateral judgement about what content I can and cannot see. Certainly, explicit pornography (such as material that might be rated R 18+ or X 18+ under the Australian classification system) can justifiably be restricted, but the human body is a wonder of nature, and can be very artistic.
Given that “nudity of moderate impact” can be included in material rated M (Mature) in Australia, I think that Apple should allow any app to carry material up to at least this level (or maybe MA 15+), as long as appropriate safe browsing modes are enabled.
Perhaps what Apple should do is bake safe mode tools into iOS, and allow individual apps to access these settings. If safe mode is turned on for the device, individual apps should respect it. If it’s turned off, then material up to the M or perhaps MA 15+ classification should be allowable.
Apple needs to provide a balance in iOS. Certainly, most users don’t want or need “Wild West” access to every aspect of a system that is part of an important communications device, but at the same time, Apple should not act as a censor. Censorship is a tricky subject, and often leads to a slippery slope. It is the role of democratic government to make considered decisions on behalf of the people; it is not the role of companies like Apple to make such decisions unilaterally.