Blog | Apr. 28, 2016
In the chapter “Simulacra and Simulations” of his Selected Writings, Jean Baudrillard offers an analysis of the simulacrum — a representational image without an original — which I believe can be applied to icons. In particular, Microsoft Word’s stupid save icon:
As Baudrillard explains:
These would be the four successive phases of the image:
- It is the reflection of a basic reality.
- It masks and perverts a basic reality.
- It masks the absence of a basic reality.
- It bears no relation to any reality whatever: it is its own pure simulacrum.
Using Baudrillard’s explanation, let’s describe the life cycle of the save icon in MS Word:
- Clicking this icon indicates you are saving your file to a 3.5″ floppy diskette.
- You are saving a file, but you’re more likely saving to an internal hard disk, and not a floppy.
- Floppy disks are supplanted by Zip disks, until removable media’s decline around the end of the century. No one uses floppy disks.
- Files are saved to hard drives, stored in the cloud, or on Dropbox. The only removable media in regular use is the USB stick. The floppy disk icon remains a symbol only of itself.
And this is how Word’s save icon became a simulacrum: an image that represents nothing but itself.
Blog | Jan. 7, 2015
Here’s how I took a Mac SE/30, a 30-year-old impact printer, a disused VCR and some extraordinary effort to create my MFA directing application, and how nearly everything went wrong.
Blog | Jan. 25, 2014
I’m not saying this blog, but the concept is on its way out.*
Just as Facebook will go to its long goodnight as its users drift to other sites — some leaving because they just want to make pithy comments while passing other people’s content around, share photos of their lunch, or turn the idea of a newsletter inside out — there are better sites than one’s own blog for all of these things, and each of them comes with a strength the independent blog does not: a built-in community. Unless you’re one of my five friends, or came across this site looking for an (increasingly outdated) way to customize the RSS output of a Drupal feed, odds are you aren’t even reading this.
I’m the man in the high castle, the fool on the hill, who built a fortress around himself, and wondered why no one ever visited.
But don’t worry… as per my habit, I’m behind the times — I haven’t even listened to ArtPop yet — so I figure timtoon.com still has a few years left in it.
*Todd would say this sentiment itself is outmoded.
Blog | Dec. 17, 2013
Rumors persist about the Apple TV set, this time putting the release date at sometime in 2015.
How about half-past never?
Blog | Jul. 29, 2012
Blog | Apr. 2, 2012
I got a curious email from Jen Rhee, but thankfully Tim Dobson has been there and pointed out that this email is linkbait.
Just mentioning this in case anyone else gets an email out of the blue from Jen Rhee.
Blog | Mar. 5, 2012
Dan Frommer’s framing of the unlimited AT&T data users’ valid complaints as “whining” is so disingenuous I don’t even know where to start. Oh wait, I’ll start right here, with other “finite, constrained resources” that are given away at a truly unlimited rate:
- Cable TV
- Netflix/Amazon Prime
- High-speed internet
I feel the last one is particularly relevant. Frommer is merely an apologist for cellphone carriers’ greed. AT&T saw startups, app authors, iPhone manufacturers, et al. getting rich off of wireless internet while it got pennies for providing a commodity service. But Frommer is mistaken in thinking that the telcos deserve more money because they offer something unique or valuable. They don’t. Amazon, Apple, Netflix, Foursquare, and others provide the thing of value; AT&T is merely the dumb pipe it flows down, and the telcos have no more claim to the riches made on the internet than the power company has to argue its electricity deserves a piece of that action. You don’t thank the road for the destination.
“Value isn’t free.”
Except when you give away an “unlimited” amount of something for a fixed price, it does tend to devalue it. AT&T thought wireless data was so worthless that they were willing to give away all of it, but now they’re reneging on that deal. Just own up to the dollar signs in your eyes rather than insult your customers.
Frommer isn’t arguing against whining so much as he’s arguing in favor of corporate greed… so I guess he’s defending whiners of another sort (or as he blatantly put it, “AT&T needs to make more money.” Really, ‘needs’? What happened to just providing good service?). But his defense of AT&T’s greed over its customers’ interests is beside the point. He’s having the argument he wants to have rather than the one that customers are making: that when you call something “unlimited” it should actually be, y’know, unlimited.
By way of comparison: you can call limited data plans “unlimited”, but not say that cigarettes cause cancer. AT&T, like makers of cigarettes, needs to make more money!
UPDATE: Monday Note has a great by-the-numbers breakdown of the carriers’ profit off phone subsidies. But if you want impassioned arguments over weasel words and doublespeak, you can just stay right here.
Blog | Jan. 13, 2012
I thought it was an inconsiderate lout, but we can blame bad UI for stopping a symphony at the New York Philharmonic.
The author of the post argues, I think correctly, for “silencing everything when you mute the phone,” but then Daring Fireball runs with it, making the case that mute should mean muting only some things.
A wrongheaded idea all around, and not only because this muddies the very meaning of the mute switch. If we lived in a world where mute on my stereo mutes everything but the vocals, mute on my TV turned off everything but the commercials, and mute on my laptop speakers muted everything but email alerts, then you might have something.
Gruber argues “there’d be thousands of people oversleeping every single day because they went to bed the night before unaware that the phone was still in silent mode.” Hyperbolic to be sure, but I hear no similar plea for the thousands of calls missed because people were unaware their phone was in silent mode.
And how should users mute their alarms for “edge cases” like going to the symphony, or a movie, or a lecture, or a quiet dinner? By disabling the alarms individually? What of the “thousands of people oversleeping every single day” because they went to bed the night before forgetting they disabled their phone’s alarm?
Hmm… if only there were a simple hardware switch for turning all the phone’s sounds on or off!
Blog | Jan. 10, 2012
EA leads another unpleasant trend in video games, this time requiring a separate unlock code for online play. They saw the income generated by game resellers like GameStop and decided they wanted a piece of that action. So now if you resell your copy of Uncharted 3 (or Battlefield 3, or Dead Space 2, or Madden), expect to get less money for it, because the game’s next owner will have to purchase a separate online-play code if they want all the functionality of the game they just bought. That is, if they’re able to play the game at all.
Which is why I’m glad to see Club Nintendo, which doesn’t punish you for reselling your games, but rewards you with stuff for registering the ones you bought new (after filling out a survey, that is). It’s a reward system for buying new games, and I’m glad that at least Nintendo doesn’t go out of its way to cripple its own product. I just registered two of my Wii games, and am going to go back for more to see if I have enough points for those sweet Legend of Zelda posters!
Blog | Jan. 5, 2012
MacRumors mentioned “Reports of 50″ Apple Television in Jonathan Ive’s Lab,” and by now I’m worried that, like shoehorning Superman into The Dark Knight Rises, something dumb is going to happen because enough insane fans demanded it.
So far, the evidence we have of Apple building a TV are rumors of a 32″ TV, the aforementioned Brit who has a 50″ TV at work, and hundreds of Apple bloggers who need a persistent rumor to talk about.
While I maintain that Apple doesn’t need to make a TV to succeed in the home entertainment space any more than Microsoft needs to make an Xbox TV, holy shit does the home entertainment user interface need an overhaul. After my epic journey choosing the right home theater receiver, I tried setting it up as simply as possible, so it wouldn’t be a mumbo-jumbo of “Select input 1 on the receiver remote, switch the TV remote to AUX 2, and hit power to watch TV. To turn it off, press Off (not Power) on the TV remote, and the third red button under the back cover of the receiver remote.” — except that’s the exact solution I came up with.
Heywood Floyd had an easier time using the toilet. If ever there was a case that called for Apple-like simplicity, home theaters are it.
But here’s the situation we’re stuck in. Just as commodity PC and cellphone manufacturers choose to compete on specs, so do home electronics makers. Samsung, Sony, LG, et al. aren’t copying Apple’s simplicity of design — emulating a UI that got by with one mouse button for 15 years — they’re chasing features. Is the next TV internet-ready? Does it have WiFi? Support for Netflix, Facebook, Spotify? It’s a league of followers. Apple made a smart device with apps, therefore apps are the new thing. Never mind that the most important thing Apple brings to consumer electronics is also the thing no one is crowing about copying: simplicity.
I want a TV that is easy to set up and use, that’s it. Just as Apple got by with a one-button mouse, home theaters should be managed by one remote: an easy way to manage my inputs so they just work. I want my TV picture to look like it’s supposed to, I want my movies to look and sound the way they do in the theater. I don’t want to have to scratch my head over whether it’s supposed to be Dolby Pro Logic II or THX, or whether I’ve set my TV to 60Hz or 24. That is the one huge target that every manufacturer seems to miss. But enjoy your web-connected TV apps, when you use them.
At the end of this, I have to wonder why current manufacturers are scrambling to keep up with Apple’s magical new TV — and why it always has to be Apple that comes in and reinvigorates some ossified industry. You guys have been making flat-panel TVs for twenty years; can’t you make one that’s simple to use?
The tipping point for me was when I recently helped two of my older friends set up their TV. As far as they knew, their picture just looked “funny”. Diving in, I was baffled by the array of options, their seemingly scattered placement across the controls, and the mix of settings aimed at deep-level experts and casual viewers alike. So I messed with the picture settings, the refresh rate, and so forth until I solved it: everything looked fakey because the TV was playing a 24fps film back at 60Hz. Turning that off got it back to normal. Since it was a Sony Blu-ray player connected to a Sony TV, why not have the player tell the TV to play back at the same refresh rate as the source video? Why are other refresh rates even an option?
Same goes for the picture color settings: there’s the TV setting, the movie look, the high-contrast look (if you want everything you watch to look like Saw), and settings for tinting the picture cooler or warmer to varying degrees. This is a TV, not a Hipstamatic. So after 15 minutes of trying to get the Mos Eisley Cantina to look right, I have to wonder why we need all these settings. Though he seems to change his mind constantly about what’s in the frame, surely at some point George Lucas decided that he wanted the cantina walls to be a certain shade of rose, or cream, or tan. Why does the TV give me the option to watch with my own color settings? Among the variety of color settings, I see several wrong options for recreating the picture as it was intended. This is a case where more choice is a bad thing.
All I needed was a choice of what my ambient light is: 5600K? 3200K? Even color temperature numbers don’t mean anything to consumers. Why not the sun and lightbulb icons like on my camera, or automatically detect the color temperature of ambient light with a sensor? Oh right, that would cost extra.
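The sensor idea above could be as simple as a lookup from an icon to a correlated color temperature. Here’s a toy sketch — the preset names and the selection logic are made up for illustration, not any TV’s actual firmware; only the Kelvin values are the standard photography conventions:

```python
# Hypothetical ambient-light presets mapped to correlated color temperature.
# Daylight ~5600 K and tungsten ~3200 K are standard film/photo conventions;
# the preset names themselves are invented for this sketch.
AMBIENT_PRESETS = {
    "daylight": 5600,  # sun icon
    "tungsten": 3200,  # lightbulb icon
    "shade": 7000,     # approximate open-shade temperature
}

def preset_for_kelvin(measured_k):
    """Pick the preset closest to a measured ambient color temperature."""
    return min(AMBIENT_PRESETS, key=lambda name: abs(AMBIENT_PRESETS[name] - measured_k))

print(preset_for_kelvin(3000))  # a warm living-room lamp -> "tungsten"
print(preset_for_kelvin(5500))  # near a window at noon   -> "daylight"
```

A real TV would feed a light sensor’s reading into something like this automatically, which is exactly the “oh right, that would cost extra” feature.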
Other settings are a solution to a problem that shouldn’t exist in the first place — frame blending, interpolation, and de-juddering, specifically. While a de-juddering menu is meant to remove the “triple puck effect” of fast-moving objects appearing before and after their actual position, de-juddering creates a problem of its own when repeated shapes move past, like a fence or a flight of stairs, which the TV incorrectly perceives as judder. Turn de-juddering on and the steps become a static halo; turn it off and you have the triple-puck effect.
Why do we have judder in the first place? If you’re watching a 60Hz signal on a 120Hz TV, the set should have two chances to draw the video per frame. There is no reason for before-and-after images of a moving object if the screen is really being redrawn twice per frame. My guess is the screen isn’t refreshing the entire picture each frame. Hard to believe — a little white lie from the same people who brought us dynamic contrast ratio.
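The refresh-rate arithmetic can be sketched out in a few lines (a toy illustration of the cadence math, not how any TV’s scaler actually works): each source frame has to be held for a whole number of display refreshes, and when the refresh rate isn’t an even multiple of the frame rate, the hold counts alternate — the 3:2 pulldown cadence whose uneven timing reads as judder.

```python
# Toy sketch: how many display refreshes each source frame is held for.
# Uneven hold counts (e.g. 24fps on a 60Hz panel) are what appears as judder;
# an even multiple (24fps on 120Hz) holds every frame the same length.

def repeat_cadence(source_fps, refresh_hz, frames=4):
    """Hold count, in refresh cycles, for each of the first few source frames."""
    cadence, shown = [], 0
    for i in range(1, frames + 1):
        total = (i * refresh_hz) // source_fps  # refreshes elapsed after frame i
        cadence.append(total - shown)
        shown = total
    return cadence

print(repeat_cadence(24, 60))   # -> [2, 3, 2, 3]: uneven holds, judder
print(repeat_cadence(24, 120))  # -> [5, 5, 5, 5]: every frame held equally
```

Which is the whole argument in miniature: a 120Hz panel that genuinely redraws the full picture every cycle has no arithmetic excuse for judder on film content.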
I mean, if you can’t do something, just say you can’t do it. Don’t try to patch it over with stupid features used to fake a good picture when you can’t provide the one you promised in the first place. Maybe that’s the nut Jobs “finally cracked” about the home entertainment industry.
Who watches the television?