End the suckage of 2016 with fonts from the 2017 Comicraft sale!

If, at the end of “The Wizard Of Oz,” one of the three freaks who somehow blagged themselves onto Dorothy’s warrior quest had asked the Wizard for “an ineffable and infallible sense of visual design,” he would have responded thusly:

“My lad, I have read catalogues, advertisements, book covers, movie posters, and product packaging. I have attended design conferences and watched endless keynotes from the best minds that ever escaped from Madison Avenue, who, when confronted for the first time with actual reality, could only speak in adverbs. Softly.

Worn Out

Microsoft Band, Apple Watch, and the Moto 360.
My Apple Watch arrived on Thursday, and my unswervable sense of duty forced me to just shove it aside and keep working on something that was already going to post later than I would have liked. But! It was duly unboxed and set up Friday night and I’ve been wearing it ever since.

…As well as Microsoft Band, which I’ve been testing for a few weeks.

…Which leaves me wondering what I’m going to do with my Moto 360, a wearable that I like enough that it’s been my daily wear since October.

…And then there’s also my Swiss Railway Watch, which I still like a lot and wish I wore more.

Well, I do have two ankles that aren’t contributing anything to my digital lifestyle.

My immediate plans for the Apple Watch include a quick first-look for the Sun-Times and then (yeeks) at least three weeks of daily wear before I even consider writing up my formal take on it. It’s important to get a lot of serious “deep soak” experience with a device as fresh as this one. I’m a bit suspicious of reviews that land so quickly after the unboxing. Both the Moto and the Band seemed almost laughable in their sneak-peek promotional videos and they even made weak first impressions on me. By the end of the first week, though, they had totally earned my respect.

I’m actually grateful for this time with Band. It wasn’t the first thingamabob I’ve tested that captures sleep data, but it’s the first one that presented that feature in such a way that I actually use it. I wake up, click a button on my wrist, and get my “score.” I wish it could go into sleep-tracking mode without pushing a button, but even that limitation seems like a feature: the Band is the final screen I switch off before going to sleep, and it gives the process a formal sense of ceremony. It’s like a formal command to my brain to (please, for pity’s sake) just give up and switch off.

It’s also the first one that made me realize that (holy mother of God) I need to address my sleep deficit. “Deficit”? No, it’s practically a sleep disability. Last night, I felt my usual impulse (reinforced by decades of behavior) to just keep right on working until three or four AM. But the memories of my pitiful sleep score from the night before were still fresh, so instead, I found myself turning off all of the lights and screens and sources of noise, and then hopping into bed at 1. Like some sort of farmer!

Which illustrates the special role that wearables play. Desktop computers are the things that you move to and sit down in front of. Mobile devices are devices that follow you wherever you go (even into the can). Wearables are different from both: they’re devices that do things for you even when you’re not interacting with them at all. It’s taken forty years, but we finally have computers that have a totally servile relationship with their users.

Tegra Gives Good Demo

You can’t say that an actual market for slate computers exists today. “An actual market” implies that “there’s actual competition.” The only competition in tablet space right now is between the Verizon and AT&T versions of the iPad. And unless HP reaches into the WebOS bag and pulls out the god-damnedest rabbit you ever saw this summer, that’ll be the state of affairs through the rest of 2011.

But Apple won’t own this product category forever. Someday, someone’s going to figure out how to make a tablet that’s so good, so compelling, that even when it’s placed side by side with the iPad, consumers won’t be able to choose between the two without factoring in things like “Well, the iPad comes with free stickers…”

I don’t know when that’ll happen. But it looks like when it does, those Android tablets will have some lovely processors. NVIDIA has posted a video demonstrating the abjectly insane performance of their next-generation mobile CPU, the sequel to the Tegra 2 processor that sits in some of the best Android tablets on the mar… — well, let’s say “available for sale” — today.

This new CPU has four cores and it looks as though it runs like hot sick:

Those of you who chose not to watch the video didn’t see a demo of a game running on an Android 3 tablet built around an engineering sample of the next NVIDIA CPU. By tilting the tablet, the player rolls a marble around a fully-rendered 3D table. The marble is a light source. It knocks over barrels, which are also light sources, and they collide with each other and the table naturally. It moves through curtains, which ripple realistically and are rendered with convincing translucency. All the while, lights and shadows and reflections and physics are being rendered at a smooth framerate and in realtime. It all looks gorgeous.

Those of you who did watch the video are now thinking “Holy ****! The marble is a light source! It’s knocking over objects that are also light sources, and all of the light and shadow and reflection effects are rendering gorgeously, at a smooth framerate and in realtime!”

It’s a hell of a nice demo. Instinctively, I wonder if we’re seeing the true, overall performance of the CPU, or merely how well its 12 GPU cores can handle 3D graphics. When the demo game turns off two of the tablet’s CPU cores (ostensibly mimicking the performance of current-generation processors), the game completely falls apart. So there’s that.

Can I rescue my cynicism? Oh, easily. That smokin’ hot performance is useless if the chip drains power like a teenager shotgunning beers at a graduation party. The demo also doesn’t say anything about how much heat it generates or how big it is. I’d expect that NVIDIA is aiming for the same overall specs as the Tegra 2, but the point here is that raw performance is only one metric of a mobile CPU.

NVIDIA claims that the first tablets with this next-generation CPU could ship as early as August. Mmmmmokaaaayyyyllllllet’sssseebouthat. But 2012 is looking interesting…and my mind boggles at the thought of a handset with this CPU. Yes, the gaming would be majestic. But think about this kind of power in a 4G handset. Today’s car navigation apps can barely download and refresh a 2D map, and it’s often impossible to correlate what’s on the screen with what you’re seeing through the windshield. Imagine a phone that can download Google Maps 3D wireframes and textures and render, in realtime, the camera view of your position from a virtual chase helicopter. Driving through a city would be just like playing a videogame, only without the ability to take a shortcut through the glassed-in atrium of an office building.

Then I take a step back and remind myself that the limiting factor on Android devices has never been the hardware: it’s been the software. Google hasn’t shown much of a knack for supporting and motivating developers to create truly ambitious, world-class software. Hell, I’d be happy if they could even motivate themselves. When I tap a button in an Android app it should feel like I’m tapping a (goddamn) button. Instead, it feels like I’m filing and submitting a requisition form for the function I wish the app to eventually perform…and the person who needs to sign it has already left for the weekend.

Still! Good things ahead from NVIDIA, eh? This new CPU is just the next chip in a multiyear plan. Their current mobile CPU is the Tegra 2. This new one is code-named “Kal-El.” The next ones, due to be released from 2012 to 2015, are Wayne, Logan, and Stark.

You understand now that even without that impressive video demo, I am forced to love NVIDIA lots and lots and lots.

(I feel a discreet tug on my shirtsleeve)

(A reader whispers that he doesn’t understand why this is.)

(I explain that the CPUs are named after Superman, Batman, Wolverine, and Iron Man, respectively.)

(The reader thanks me for the explanation, and also for being so discreet about it and allowing him to keep his dignity.)

(I reply that it’s perfectly fine, I wouldn’t want him to be humiliated publicly for being the only one not to pick up on all of that.)

There’s just one bit of unfinished business: the original Tegra processor doesn’t fit into the naming structure. I don’t know much about how it got its name. It’s certainly safe to assume that they paid some company the usual six-figure fee to break a huge list of Latin words into their roots and then pick twos and threes out of a bag until they had a name that (a) perfectly encapsulated the strength, speed, and reliability that would be associated with NVIDIA’s new dual-core mobile CPU and (b) wasn’t already trademarked, nor similar to the street name for any horrible drug that’s being pushed to middle-schoolers.

But the name is tantalizingly close to that of Tigra, the half-cat, half-human hero in Marvel Comics’ various “Avengers” series.

NVIDIA should retroactively rename the current processor to preserve the flow. Naming all of your CPUs after popular characters is one way to honor and celebrate the traditions of the comic book industry…but unapologetically rewriting established history to avoid inconvenient inconsistencies with what you’ve got planned for the near-future is an even better one.

Bonus Inside Joke exclusively for comic book fans:

If NVIDIA formally changes the name to “Tigra,” then every time a piece of software causes an Android tablet to crash or suffer a huge performance hit, we could say “Man! This new app is totally Bendising the CPU!”

(Thank you.)