Posts About Sparks
We recently noticed something astonishing. 817 different mobile devices accessed ZURB.com in the past 30 days. That's amazing! Having a flood of smartphones and tablets access our site proves that responsive design isn't a choice anymore.
The number has grown by leaps and bounds in the past couple of years. We recorded 142 mobile devices coming to our site when we released Foundation in 2011. Now it's 817 — a 475% increase in just two years.
These aren't users who casually check out the site and leave after a few seconds. Most of them stick around for a minute and 55 seconds. Sure, that's lower than the four-and-a-half minutes of the average desktop user, but mobile users are snackers.
It's only going to continue to grow. Consider this: 38% of mobile usage in the last six months came from 3,430 different devices and operating systems. Tablet use has shot up 10% in that time as well. All of which goes to prove that responsive design is crucial to meeting the needs of customers across a variety of smartphones and tablets.
It's impossible to design for every specific device. Responsive design is the only way to ensure our products do reach as many people and as many different devices as we can. It's the only practical solution that allows companies to serve up content to a growing mobile audience.
Alright, they make you look more than a little like a cyborg. Wearing them out in person is awkward and makes you weirdly self-conscious. But you can't deny that 'Glass' from Google is fascinating technology. And Foundation works pretty well on it.
We were lucky enough to secure a pair through the #ifihadglass Explorers program, and we've been playing around with them all week. While people have written ad nauseam about the experience of wearing Glass, we had a more specific thing we wanted to play with: exploring the web through them. Amazingly enough, it's not a terrible experience.
Accessing the Web
Almost everything on Glass is voice-cued, so to reach a website at all you need to tell the device to Google something, for example, "Product Design Company in Campbell." After Glass shows you the Google results, you can elect to View Website, an action made available in the last Glass update.
Glass does render arbitrary pages, and does a … pretty good job. Performance-wise, Glass is about on par with a two-year-old Android device. Pages render somewhat slowly, and anything that leans hard on the hardware, like CSS transitions at a decent frame rate, is going to suffer quite a bit. That being said, the browser is a modified version of Chrome, and it renders pages correctly in most cases.
Using a Page
When it comes to using a page, the controls are intriguing and pretty cleverly done. Using the touchpad on the side of the device you can easily scroll up and down by sliding your finger along it. Sliding along with two fingers zooms in and out (a very choppy affair). Where it gets cool is when you hold down with two fingers and just … look around.
As you move your head, the built-in accelerometer translates your head's movement into both a means of panning around the page, as well as the means of targeting a link to select. A reticle in the center of the screen shows your target, and tapping on the touchpad gives you an option to select and follow the targeted link. It's a crafty way of using a natural action like looking around and mapping it to the Web. It works really well, even if you look like an aimless idiot to anyone watching you.
Foundation + Glass
Once we learned how it worked and what it displayed, we were pretty confident Foundation 4 would have no issues on Glass. The pages it shows are presented as mobile screens, which makes sense; Glass reports a resolution of 640x360. Because Foundation 4 is mobile-first, pages are rendered as small-screen pages using the small device grid.
We found one issue with Clearing, which has some trouble with how Glass handles canvas panning, and of course some things like Joyride can be a bit tedious to pan around and deal with. But by and large, Foundation 4 has you covered for websites shown on Glass. We'll do some more testing to verify this, but we were pretty happy that our thesis for Foundation, being ready for devices that don't exist yet, is holding up. Foundation 4 was ready for Glass. Pretty neat.
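To make the grid behavior concrete, here's a minimal sketch of why Glass falls into the small grid. It assumes Foundation 4's default small/large breakpoint of 768px (the function name and the hard-coded breakpoint are our own illustration, not framework code):

```javascript
// Hypothetical sketch: which Foundation 4 grid applies at a given
// viewport width? This assumes the framework's default behavior:
// the small grid is always active, and the large grid takes over
// at a min-width of 768px.
function activeGrid(viewportWidthPx) {
  var LARGE_BREAKPOINT = 768; // assumed Foundation 4 default, in pixels
  return viewportWidthPx >= LARGE_BREAKPOINT ? 'large' : 'small';
}

// Glass reports a 640x360 resolution, so it gets the small grid,
// while a typical desktop viewport gets the large one.
console.log(activeGrid(640));  // 'small'
console.log(activeGrid(1024)); // 'large'
```

Since 640 sits below the breakpoint, any `small-#` column classes apply on Glass with no extra work, which is exactly what we saw.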
Velociraptors have been hard to spot in recent years. You can thank the downfall of that Jurassic exhibit off the coast of Costa Rica. (Really, what were they thinking?) After that, raptors were practically extinct. Unless you know the Konami Code. Then you might be lucky enough to see one in the wild of the interwebs. We recently had a raptor sighting while browsing Vogue's UK site and punching in the Konami Code.
A few of the Raptors spotted on Vogue UK. Don't ask what we were doing on there. We just like fashion and the web. Stop judging.
Anyhoo, we were delighted to discover that this raptor took after Madonna. She came dressed to impress, wearing some stylish headgear. A new chic raptor loads each time. So far, we've found six. How many more can you find?
A Raptor of Your Very Own
While we don't know how Vogue corralled their Raptor, we can help you capture your very own for your site. We have a jQuery plugin that will allow you to do the same thing. You can make your site Velociraptor ready with screech included — just like a "Jurassic Park" action figure.
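For the curious, the core of any Konami-Code Easter egg is just a little sequence matcher on keyboard input. This is our own minimal sketch, not the plugin's actual source; the function names are hypothetical:

```javascript
// Minimal sketch of a Konami-Code listener (our own illustration,
// not the plugin's source). The classic sequence is:
// up, up, down, down, left, right, left, right, B, A.
var KONAMI = [38, 38, 40, 40, 37, 39, 37, 39, 66, 65]; // keyCodes

function makeKonamiTracker(onUnlock) {
  var position = 0; // how much of the sequence has matched so far
  return function (keyCode) {
    if (keyCode === KONAMI[position]) {
      position += 1;
      if (position === KONAMI.length) {
        position = 0;
        onUnlock(); // release the raptor
      }
    } else {
      // Wrong key: restart, but let this key begin a new attempt.
      position = keyCode === KONAMI[0] ? 1 : 0;
    }
  };
}

// In a browser you'd wire it up roughly like this:
// var track = makeKonamiTracker(function () { /* unleash raptor + screech */ });
// document.addEventListener('keydown', function (e) { track(e.keyCode); });
```

Once the sequence completes, the callback fires, which is where a plugin would inject the raptor image and play the screech.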
Apple's new iOS7 is a great step forward full of useful innovations and a great visual refresh, but their work has been marred by the design details that people simply can't look past.
Tim Cook unveiling iOS7 to the WWDC audience. (Image from Apple)
The other day Apple unveiled a series of updates to their products and operating system to a room full of designers and developers, as well as to those clever enough to pull up apple.com in Safari. While the updates to OS X, the Mac Pro and the MacBook Air were met with open arms, the highly anticipated iOS7 wasn't. Some people welcomed the change. However, an overwhelming number of people told it to go back home (and they made sure to echo that sentiment via Twitter).
iOS7 doesn't need to revert back; rather, it needs to take what it has and do some refinements before it launches this fall.
A Much Needed Evolution
iOS needed to update. We all knew that. When it was released, realistic elements were needed to familiarize users with the device. There had never before been a fully touch-screen phone. The little skeuomorphic touches — such as popping the keyboard keys off the background to imitate an actual keyboard or a notepad with ripped papers — helped people understand what they needed to do.
It was intuitive.
Apple carried iOS's realistic visuals to the iPad, which, again, needed some familiarizing since its touch interface was such a new concept (amidst critics touting it as the 10" iPhone). But how else would users instinctively know that turning a page in iBooks was as simple as swiping the page, similar to how you would in an actual book?
Flash forward to today. Almost six years later. Those skeuomorphic details still exist, but we've been trained to use them. We know that, yes, we tap on things to select something, or that we swipe to move an item or progress somewhere further. These gestures are natural for the everyday user, so all those extra hints (look, it's a compass — tilt it to find where north is as you would a normal compass) aren't as necessary today as they were in 2007.
iOS7 — The Start of Something Great
When iOS7 was announced, we were ecstatic. This was a major facelift — a redesign worthy of setting new standards in the industry, and one that introduces much more intuitive gestures to some of the most common actions today's iPhone users do.
For example, to fully quit an app in iOS6, users must access the multi-task panel, tap and hold the appropriate app icon until the close "x" appears, then tap that "x". To fully quit an app in iOS7, users access the multi-task screen, find the appropriate icon, and swipe up. Changes like that may seem minor, but they're the ones that make a product go from good to great.
Everything in iOS7, from the thoughtfully crafted animations to AirDrop to the new icons, has been redefined, and we're huge fans. It's definitely the direction Apple needs to take to continue delivering delightful, highly anticipated products to its customers.
Multi-tasking in iOS7 is more visual, and it's easy to switch to another app (tap), view open apps (swipe left/right), and fully quit one (swipe up). (Image from Apple)
But Those Icons … The Horror!
With every release, however, come the critics. Seconds after the announcement came a flurry of tweets proclaiming blasphemy in the form of the app icons. "They're too vibrant." "The gradients are too harsh." "What's up with Game Center?" Within a few hours, Dribbble was filled with solutions to these criticisms.
These were all valid arguments. The line work wasn't consistent, a few gradients were more severe than others, and, aside from the colors, the icons didn't feel like a cohesive set. But we think that's OK!
Think about it. If you own an iPhone, iPod Touch, or iPad, when was the last time you had only the stock Apple apps? These icons will be living next to every other app you have (or in some folder, anyway), and surely they don't all fit a single style and color palette. Vine's icon is fairly flat with a subtle text shadow, while the Instagram icon is extremely rich in detail.
And hey, the Newsstand icon is a nice upgrade!
Example of the new icons mixed with other icons.
Wait, Look at That UI. What a Lousy Execution!
It's fair to argue that. While the interactions are more intuitive in this upgrade, the visual execution suffered a little. Apple's pixel-perfect, leather-stitched designs were swapped for a flatter, more streamlined design that came with a few flaws: the visual weights of the icons aren't balanced, line weights vary too much, and so on. "Simplicity is actually quite complicated," as they've noted. But it's Apple — we expect perfection.
That's where iteration comes into play.
While the icons in Mail work nicely together, the back/forward arrows in Safari feel unbalanced next to the other icons in that tab bar.
iOS7 is out there for all of us to touch and test. Now Apple must use feedback from early adopters to refine the operating system before its fall launch. The interactions are there. It's the visual refinement that needs to be addressed.
You Go, Apple
"To create [iOS7], we brought together a broad range of expertise from design to engineering. With what we've been able to achieve together, we see iOS7 as defining an important new direction, and, in many ways, a beginning." (Jony Ive)
Jony Ive sums it up perfectly. It's a beginning. It's a beginning of great forward-thinking design that focuses on users and thinks two steps ahead for them. It's a beginning of rethinking standards, and a beginning of a new voice emerging from Apple.
Everyone's going to have an opinion about iOS7. As for us, we fully support the direction Apple's going. We're all about pushing products forward to make them more usable, as well as to have their own unique voice. For Apple, that voice has been established with its iOS7 design (and that voice comes in male or female).
You go, Apple.
The last few Soapbox speakers we've had have all spoken in some way about how important it is to get your story straight. We often chat about story around ZURB. Mostly, it's about telling the ongoing story that we've built over the course of the past 15 years and how we can do that better. That is, trying to find stronger ways to explain who we are, where we come from and what we stand for.
Recently, we ran across a New York Times article that broke down how important storytelling is to the morale of a family. And what is a company but another kind of family? What really caught our eye was this little bit:
... if you want a happier family, create, refine and retell the story of your family's positive moments and your ability to bounce back from the difficult ones.
Think about it for a second. It's more than just retelling that story, it's about refining it. That's something not only for families, but for products as well. Mostly, it's for those building your products.
Story is Vision
Another way to think about story is vision. What's the vision you're trying to convey? What is it, specifically, that you're trying to achieve? YouSendIt CEO Brad Garlinghouse said at his Soapbox that having a good vision and telling it well requires knowing who you are. We'd add that you also have to articulate it well.
Brad knows all too well the consequences of not being able to do so. After all, he wrote the Peanut Butter Manifesto on why Yahoo had lost its way, trying to be too many things to too many people. Along the way, the search engine forgot who it was. And, at that time, employees could no longer effectively rally behind each other as a family. They had nothing to fight for anymore.
Brad used YouSendIt as an example, saying the company "is going to empower you to share and control your content like a professional." We too can say who we are succinctly. We're a close-knit group of product designers who help companies design better products faster. We've known that for quite some time. We've also been able to articulate our vision for the world. And we work hard to continue to say it well, figuring out better ways to communicate it, whether it's through words or pictures.
Tell Your Story Better
In order to tell your story better, you have to be able to do the following:
- Write it down. A story isn't good unless you put it into written words. Keep a doc — Google or otherwise — around so that others can refer to it. You can even write it as a Dreadful Mission Statement.
- Make it real. Writing down your story also makes it real. It becomes tangible, something others can touch. But more than writing it down, it has to also be actionable to be real. Is it something that can be accomplished?
- Speak from the heart. If you can't speak from the heart, if you don't believe in it, then others won't either.
Knowing who you are and what you stand for is half the battle. In order to keep telling that story well, you've got to refine it, making it better not only for your customers but for your employees as well, giving them something to rally behind as a family.
We've been having more and more conversations around content and mobile lately. Last week, while we were in the midst of releasing Foundation 4, Bryan made an observation about how we have to take a critical eye to how content is placed in a mobile context. Which brought to mind: does that mean mobile also changes the way we write that content?
The other day, we came across LukeW's notes on Karen McGrane's talk at An Event Apart, "The Mobile Content Mandate." What particularly caught our eye was this bit in Luke's notes:
There is no such thing as writing for mobile. There is just good writing.
Mobile is a catalyst that forces you to write better, more concise copy without sacrificing clarity, Karen stated. There's no need to write separate copy for desktop, tablet and smartphones. If the content is well-written and engaging, it can carry you from device to device.
After all, well-written content in concert with form elements and visuals can make a page more desirable to use, regardless if it's on a desktop browser or a mobile one. But what makes good copy? SEOmoz says that great content has the makings of:
- Credibility: Think of this as write what you know. Better yet, write to your expertise, your strengths.
- Real effort: It really shines through when a post is well-researched and the writer has put time into it. That care shows.
- Actionable: This is the takeaway that urges readers or users to take action.
- Begs to be shared: Good content is something that needs to be shared, but you should also want and be proud to share it.
But that's not to say there isn't a need for a mobile strategy, or that you don't have to plan out how best to structure your content. It's a good rule of thumb to have just that. However, what Karen is saying, in the end, is that if you have all the makings of great copy, you don't need to write specifically for mobile. Your copy will transcend the device it's read on.
We've moved beyond devices. Our smartphones and tablets have become extensions of ourselves. And the most perfect example of this is Google's upcoming glasses. Certainly, it's the most organic, electronic extension. Yet it might be the most limiting.
With Glass, we'll see the world slightly differently. Products and apps will all be within the blink of an eye. Now Google has given us a peek into how folks will interact with and use the device. Check out this video and notice how it might not be as liberating as you might think:
A lot of the interactions are through voice, which could hinder where Glass can actually be used. Sure, it'll be terrific in a car, where we're handsfree. But we know how frustrating using voice commands can occasionally be (we're looking at you, Siri).
That's not to say that there aren't advantages. There are. Like we said, it's convenient for remaining handsfree while driving, especially when sending text messages. You don't have to use your fingers to actually use your phone. And Glass seems like the next step in this handsfree evolution. But you can't use voice all the time.
Think about it. You're in a library and you need to Google something. It becomes difficult to use voice commands, you might disturb others. Or the reverse, you're at an outdoor concert with wind, crowds and speakers. The noise might render your Glass useless. It wouldn't be able to make heads or tails of what you're saying. You won't be able to tell it to record the concert or snap pictures of your friends. And for those of us that wear prescription glasses, we might be out of luck (although, we could foresee a prescription model down the road). And using them for scuba diving might be a bit difficult.
Right now, it also seems that you'd be completely reliant on voice, so you won't be able to manipulate data. Although, that might not be too far off.
While Glass might seem limiting, it's still exciting to see it one step closer to being in our actual hands. For all its limitations, there's plenty more opportunity opened up by it. How do designers work around these limitations? How do we build products where we can manipulate data or tools without the use of our hands or gestures, only our voice?
Limitations are only constraints by any other name. And those can only force us to design smarter and can actually be liberating.
Remember in "Star Trek: The Next Generation" how Captain Picard's desk was littered with dozens of tablet computers, all of various shapes and sizes, for a variety of tasks? That seemed kinda silly once the iPad came out. One device that could do everything ... nearly. Now we have different-shaped iPads, Kindles and other tablets.
Not so silly after all. But what's more surprising is how much the tablet is becoming our go-to computing device, so much so that it's slowly stomping out other devices. Here are some numbers recently collected by our good friend LukeW:
- E-book readers, solely meant for reading, will plummet this year to 14.9 million shipped, a 36% decrease. By 2016, that number will drop even further to 7.1 million. Tablets that do more than display books are to blame.
- Notebooks with Windows inside dropped 24% compared to last year. The finger is firmly pointed at tablets and, of course, our smartphones.
- Tablets are expected to outpace smartphones this year in sales, 9.4% of the 15% of all US mobile sales.
Luke also notes that folks aren't buying devices dedicated solely to one purpose in their lives. That is, a no-frills cell phone, a basic "point-and-shoot" camera, you get the idea.
So What Does This Mean?
One thing is that we're expecting more and more out of our devices. We want them to do more. We're no longer satisfied with devices that live for solely one thing.
You might be thinking, "Whoa, hold your horses ... does this mean we should start designing solely for tablets then?" Nope. We can't go backwards to the days when we designed specifically for one device. Just because tablets are taking the lead doesn't mean that the tablet will rule the device nest. Quite the opposite: this means now more than ever we need to design with multiple devices in mind.
Here's why: there are still dozens upon dozens of tablets on the market. There's the iPad, its mini-me, the various Kindles and who knows what else on the horizon. Speaking of tomorrow, there are also smartwatches and, of course, Google Glasses. And we doubt that anyone will toss out their smartphone anytime soon. Well, unless the BlackBerry makes a comeback (kidding).
Numbers like these just go to show us how imperative it is to design for multiple devices, and to do so responsively. It's the only way we can meet the hunger of an ever-growing mobile audience.
Devices are changing quickly. What we use today will no doubt change in the years to come. How we interact with them will be different. Just think about how we used cell phones five years ago compared to how we do today. Things are moving at warp speed. The future is here. We're building it now.
At the start of the year, we tapped into our inner soothsayer and featured what we saw coming up in the near future for responsive design. We highlighted things a little ways down the road, like devices in our ears, our cars and our eyes that could change how we approach design.
Now there's something else to consider — paper tablets. We recently ran across an article and video that featured these sheet-thin devices. Take a look:
Smart paper has been made. One day it could be in our hands. Sure there are things to work out, like having the processor housed within the sheet. Right now, it's housed elsewhere and the device has to be plugged in.
Around the office, we talked about how it looks incredibly hard to use. So there could be some usability problems yet to solve. Typing, for instance, might not be the best fit for that format. Maybe it's better suited for writing with a stylus.
That being said, within five years our iPads could give way to iPaper.
A flexible screen makes responsive design even more of an imperative. Imagine how people will use your products and how they'll look on a screen that can be rolled up in their hands like a newspaper. The possibilities are endless. Let's just hope folks don't use their smart papers to swat flies with.
Happy New Year! 2013 is here and we can't help but keep thinking that we're really living in the world of tomorrow.
Think about it, science fiction is rapidly becoming fact as technology achieves what only writers could once dream. And you don't have to be a soothsayer, or a science-fiction author, to see that wearable devices will one day be an everyday reality for all of us.
We've all seen the nifty video that showcases the possibilities of Google's Project Glass. That video only gave us a taste of what could be achieved with a wearable heads-up display, and it was certainly more wishful thinking than potential reality.
But what does this all mean for how we design products in the not-so distant future?
Your Eyes Will Do the Walking
Jonathan has previously pointed out that wearable devices, such as Google Glasses, are really extensions of self. That the devices will merely disappear as we have more integrated forms of interaction where content is paramount and data is ubiquitous.
That's something the Google Glasses design team is firmly working toward. Babak Parviz, head of the project team, recently gave some hints on what's to come. Here are a few key points:
- Augmented Reality: Isn't the immediate goal, but will eventually "come into the picture."
- Interactions: Still working out how people will interact with the Glasses, from using a touch pad to voice commands. They've even experimented with head gestures.
- Sharing: A main goal of the project is to allow people to share videos and pictures with one another, to share how they view the world through their eyes.
- Apps: While the team is still figuring out the interactions on this new platform, there will be a cloud-based API for developers to integrate with the Glasses.
These tidbits all suggest that our eyes will very soon do the walking on the web, which creates some interesting design challenges. Let's take a look.
Our products will one day no longer be confined to a display screen. With a heads-up display, our products will be right in our users' eyes. That means we'll have to consider how best to present content on those devices.
Or as the Nieman Journalism Lab recently put it, we'll have to ask:
How does this look jammed right into a user's eyeball?
The lab may have been asking in terms of news organizations, but it's a question that all product designers will have to consider. We've already begun moving into a mobile-first design ethos, but will there come a day when we take an eyeball-first approach?
Voice or Gestures
Another consideration will be how we actually interact with devices without a touch pad. We're already aware that true hover states don't exist when it comes to mobile devices. We also have to take into consideration the size of touch targets when it comes to fingertip actions.
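As a concrete illustration of the touch-target point, here's a minimal sketch of a size check. The 44px minimum is an assumption borrowed from Apple's Human Interface Guidelines (44pt); the function name and threshold are our own, not any library's API:

```javascript
// Illustrative sketch: flag touch targets smaller than a comfortable
// fingertip minimum. The 44px threshold is an assumption borrowed
// from Apple's Human Interface Guidelines; pick your own per platform.
var MIN_TOUCH_TARGET_PX = 44;

function isComfortableTouchTarget(widthPx, heightPx) {
  return widthPx >= MIN_TOUCH_TARGET_PX && heightPx >= MIN_TOUCH_TARGET_PX;
}

// A 30x30 icon button is too small for a fingertip; a 48x48 one is fine.
// In a browser you'd feed in an element's getBoundingClientRect() size.
console.log(isComfortableTouchTarget(30, 30)); // false
console.log(isComfortableTouchTarget(48, 48)); // true
```

A check like this is the kind of thing that simply disappears once the "pointer" is your gaze, which is exactly why the interaction model has to be rethought.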
Now we're in the early stages of voice commands with things like Siri. Researchers are hard at work on the use of spatial gestures. With a wearable device, the day will come when our interactions won't be trapped in four corners.
How We Get There
While Google Glasses may have their own line of native apps, users still expect content parity across all devices. And whether it's on a smartphone, tablet, desktop or Chrome for Your Eyeball, users will still want the same functionality, which responsive design does allow for.
In other words, responsive design is the first step to meeting the challenges that wearable devices present. We have a lot of work to do in the coming year. But tomorrow is no longer around the corner. It's here. And we may all soon have responsive design in our eyes.