
This Soapbox has come and gone!

  • Bill Scott, VP at Meebo

We hope to see you at the next Soapbox!

Recapping Bill's Soapbox

We all had an amazing time at the ZURBsoapbox with Bill Scott last Friday. We had a great turnout of folks, lots of sushi, and a passionate discussion about interaction design. A few ZURBians even won a copy of Bill Scott’s Designing Web Interfaces book! What more can you ask for?

Listen to Bill's Podcast

Subscribe: iTunes | RSS | Read Transcript

Anti-patterns - do you have any?

Bill started the discussion off by mentioning that all interfaces are made up of hundreds of "interesting moments": minute interactions that make or break the user experience. Too often these moments go unnoticed, and the result is anti-patterns that make the UI confusing and hard to use. Here are a few great examples of anti-patterns Bill shared:

  • Intuit's TurboTax: Watch what happens as you enter your interest paid for the year on your federal tax return. The tax amount you owe the government skyrockets and then drops back down. This nearly gives the user a heart attack.
  • Yahoo! Photos site: Watch what happens as you drag a picture into a folder. Dancing hamsters and dialog boxes (Bill calls them "idiot boxes") pop up, interrupting and confusing the user. The golden rule of any UI is to stay out of the way and let the user complete the task (see the sketch after this list).
  • Barnes and Noble: An interesting example of endless scrolling in the book selector. It moves at a fixed speed the user can't control, so it ends up being a conveyor belt instead of a carousel.
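
To make the "interesting moments" idea concrete, here's a minimal TypeScript sketch of the kind of fix Bill suggests for the Yahoo! Photos case: instead of interrupting the user with dialog boxes after the fact, the drop target responds at each moment of the drag. The element IDs and CSS class are our own illustration, not code from either site.

```typescript
// A minimal sketch: respond at each "interesting moment" of a drag
// instead of interrupting the user with dialog boxes afterwards.
const album = document.getElementById("album")!;   // hypothetical drop target
const counter = document.getElementById("count")!; // hypothetical photo count badge
let photosInAlbum = 0;

album.addEventListener("dragover", (e: DragEvent) => {
  e.preventDefault();                  // moment: hovering over the album, allow a drop
  album.classList.add("highlight");    // moment: light the album up
});

album.addEventListener("dragleave", () => {
  album.classList.remove("highlight"); // moment: the drag moved away again
});

album.addEventListener("drop", (e: DragEvent) => {
  e.preventDefault();
  album.classList.remove("highlight");
  photosInAlbum += 1;                          // moment: the drop succeeded,
  counter.textContent = String(photosInAlbum); // so update the count beside the album
});
```

Each handler here is one cell in the grid Bill describes later in the transcript: a moment you choose to engage, rather than leaving it to a default.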

Netflix: What made the biggest difference

Bill mentioned a number of factors which contributed to Netflix's killer user experience. Some of them included:

  • Getting rid of the synopsis text on the search results page and moving it into a tooltip helped folks find movies with less work.
  • Making the movie box shots larger improved both the user experience and consumption.
  • Adding transparency messages such as "We picked movie Y for you because you watched movie X" helped a great deal. This built trust, which prompted engagement (a rough sketch of such a message follows this list).
  • Hiring people to micro-tag each movie created very targeted tags. The response from users was: "Wow! These Netflix guys really know me!" This increased consumption a great deal.
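
As a rough illustration of the transparency idea (our own sketch, not Netflix's actual code), the key design choice is that a recommendation carries the evidence it was derived from, so the UI can explain itself:

```typescript
// Hypothetical shape for a recommendation plus the evidence behind it.
interface Recommendation {
  title: string;   // the movie being recommended
  because: string; // the watched movie that triggered the suggestion
}

// Surface the reasoning right next to the recommendation to build trust.
function transparencyMessage(rec: Recommendation): string {
  return `We picked ${rec.title} for you because you watched ${rec.because}.`;
}

console.log(transparencyMessage({ title: "The Usual Suspects", because: "Memento" }));
// -> "We picked The Usual Suspects for you because you watched Memento."
```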

Netflix: Tell us about a mistake you made

Netflix had a goal of increasing movie consumption by increasing the number of people who tell the site's taste engine which movies they like. More information about a user's taste leads to better recommendations, which leads to increased consumption. Simple enough?

Here is what ended up happening:

  1. Netflix asked new users to rate movies they liked
  2. During the two-week trial, new users spent most of their time rating movies instead of watching them
  3. As a result, retention dropped significantly, since users never perceived the value of the service

What was the solution? Netflix simply moved the movie-rating control into an area of the site that mostly existing members visit. New users ended up spending most of their time watching movies instead of rating them.

It's hard to summarize 25 years of experience in a 25-minute discussion, but Bill did a phenomenal job highlighting the techniques and best practices behind the Netflix UI. We'd like to thank Bill once again for such a fun and insightful talk!

Don't Miss Out on Our Next Soapbox

Transcript

Bill: I like to think of interfaces as having lots of interesting moments, magic moments, if you will. We might create six or eight variations on an experience. The reason you get those two dialogue boxes which I like to call idiot boxes is because the interesting moments weren't paid attention to.

Moderator: All right. Well, thanks, everyone, for coming to ZURBsoapbox. Just a couple of reminders about the next few events we have coming up. We have Luke Wroblewski stopping by June 18th.

Bill: Go, Luke!

Moderator: He's the chief design architect at Yahoo and a good friend of Bill Scott. He'll be stopping by and giving the same talk he gave at a recent event, in part about smartphones. That's coming up June 18th.

July 9th, we have Nate Bolt of Bolt and Peters, a user research guru. He's conducted hundreds and hundreds of user research studies for Oracle, Electronic Arts, Sony, HP, many different companies. He'll be stopping by to talk about remote user research on July 9th.

If you guys haven't signed up for the ZURBsoapbox e-mail reminder, there's a piece of paper on Amanda's desk, that big desk here, to put your name and e-mail on, and we'll shoot that e-mail reminder to you for the next event.

So, we're stoked to have Bill Scott here. Bill Scott of Yahoo, where he curated the Yahoo pattern library, which, if you guys have not checked out, I strongly suggest you do. It's an awesome resource for any designer who's trying to solve a problem with a pattern.

Bill Scott of Netflix, of course, whose user experience all of us love and use every day. I just mailed my DVD back the other day, and I'm looking forward to my next one.

Bill: Did you keep it a long time?

Moderator: Huh?

Bill: Did you keep it a long time?

Moderator: Just about a week, maybe.

Bill: Okay. The people who keep it a month pay for the service, so you know that.

Moderator: Oh.

Audience Member: I kept one four months once.

Bill: There you go. Thank you very much.

Audience Member: I did my part.

Moderator: Okay. I'll try and do it.

Bill: You're part of the reason.

Moderator: My grandmother uses it, and she's 86 years old, and she streams movies on it.

Bill: That's great.

Moderator: And the UI, how simple and killer to use is that? She only uses Netflix. I think she knows Google.

Bill Scott of Meebo, who just recently joined Meebo, where he's facing new challenges. He's getting their plug-in for the browser going and getting the IM aggregator going as well. Meebo, as you guys know, is one of the world's largest IM aggregators. It was launched in 2005 and is open source as well.

If that's not enough, this Designing Web Interfaces book that he co-authored with the recent . . .

Bill: I'll give a couple away.

Moderator: It's an amazing, amazing resource for anybody. I'm still trying to sink my teeth into it.

Bill: I'm going to step back here.

Moderator: I was just talking about it earlier. It has lots of tools too, for how to critique an interface and find little faults with it. His interesting moments grid is in there, which we'll probably touch on today. It's a great book, the kind of book you probably keep chained to your desk, because people would probably want to use it. It's amazing, amazing.

So, he's going to take his 25 years of experience and . . .

Bill: You can tell he does marketing, totally, totally. I feel like I've been marketed. Thank you, Moderator.

Moderator: He's got 25 years of experience . . .

Bill: Basically, 25 minutes, one minute per year. When I was born . . . no.

Moderator: He's going to touch upon some Netflix topics: how the designers and engineers work together on the team; how you make decisions with that team.

Bill: Yeah.

Moderator: How do you raise morale without trying to raise morale? How do you design by analytics versus designing by principles? How do you stay relentlessly simple while adding features into the interface? How do you kill features from an interface? All of these things are super important to create that awesome user experience, such as Netflix and other workspace web apps.

With that, let's welcome Bill Scott to ZURBsoapbox.

Bill: Thank you. I have a few slides, but I don't even care if I get through the slides. It's basically just to have a conversation. One of the things just to set the stage a little bit, I like to think about interfaces as having lots of interesting moments, magic moments, if you will. You can think of the user experience as a user illusion. Basically, you're trying to create some kind of magic.

You're turning just these bits into something that looks like the real world, and we see it more now with the iPad and other devices. In natural interfaces and touch interfaces, we can see that evolution happening where things feel more natural. You've probably all seen the YouTube video of the cat playing with an iPad, that kind of thing.

Fitzkee was an author of magic books back in the 1940s, and Bruce Tognazzini actually had an article back in the '90s on magic, and he quotes Fitzkee a lot. I actually got the trilogy recently, which is pretty interesting. He says, "Magic is both in the details and in the performance." I think that ties directly to user experience.

There's obviously a lot to do with overall flow and information architecture. I tend to focus a lot on the interaction design because I think that seems to be missed a lot. You take something simple like this iGoogle example where you're just doing a drag and drop.

It looks pretty simple, and when I was at Yahoo we were crafting the Yahoo user interface library, the YUI Library, and my role in that was to think through drag and drop first, because it was probably the hardest thing to get into the toolkit, and to think about it from a design perspective and also from an engineering perspective: what would we put in the toolkit?

So, I found myself cataloging a lot, writing lots of notes, 20-30 pages, which was kind of silly. I figured I could use some design skills here and actually put it in some kind of a grid, which I did, because when you start looking at the interaction, there are a lot of events in drag and drop. I think that's what a lot of people don't consider when they get into these things, these simple interactions.

There are also a lot of actors that get involved. Think about the events: even what you can do when the page loads, or if you drag back over the home area that you started in. These are all little subtle things that show up in drag and drop, and you can do things with the drag object, the ghost, et cetera.

This is really terrible . . . I should probably have used white. There's actually a grid here.

Audience Member: I can see it.

Bill: Yeah. You can see it faintly. There's a grid. The colors aren't showing up too well.

Audience Member: The events go on it.

Bill: Basically, the events go along this way, and the actors run this way. If you actually chart an interaction like that, you can see in this case there are something like 96 interesting moments: moments in time during an interaction where you can choose to engage or not engage. Just take that simple iGoogle example; you can see it graphed onto this matrix, this interaction matrix.

In fact, I called it the interaction matrix, and one of the guys on the team, Eric [Morelia], came up with a sexier name, the Interesting Moments Grid. I asked him, "Can I steal that?" And he said, "Yes, you may use that." So, I give Eric credit whenever I can. You could ask, for example, why don't they just have the arrow here? It could be a grab hand, things like that. Probably because nobody thought of it; probably because it just happened by default. It wasn't considered. So, it's just little stuff.
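
To picture the grid in data form, here's a small TypeScript sketch. The actor and event names are our own illustration, not the actual rows and columns of Bill's grid; the point is that every empty cell is an unconsidered moment.

```typescript
// Sketch of an interesting-moments grid: events across the top, actors down the side.
// The names are illustrative, not the actual grid from the book.
type DragEventName = "pageLoad" | "dragStart" | "dragOverTarget" | "drop";
type Actor = "cursor" | "dragObject" | "ghost" | "dropTarget";

// Each filled cell records how an actor engages at that moment.
const grid: Record<Actor, Partial<Record<DragEventName, string>>> = {
  cursor:     { dragStart: "switch to a grab hand", dragOverTarget: "show a move hint" },
  dragObject: { dragStart: "lift with a slight shadow" },
  ghost:      { dragOverTarget: "snap a preview into place" },
  dropTarget: { dragOverTarget: "light up", drop: "update the item count" },
};

// 4 actors x 4 events = 16 cells here (Bill's real chart had 96).
// Every empty cell is a moment that was left to happen by default.
```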

If you look at the evolution of iGoogle, their drag and drop, it was really, really bad to begin with. It's pretty good now. It's decent, but at first everything wiggled around the page and jumped around. What happens is, if you're not careful, you can break those magic moments.

This is Turbo Tax. If there's a $10,533 tax bill and you change something, watch what happens to those numbers. It's about to happen now: 10,000, jumps to 98,000 and back down. If you look at it in slow motion, you'll see it here. This is kind of like the Biggest Loser thing happening here, 16,000, 98,000, 33,000, 53,000, 31,000.

You choose to engage with the user in a way like this, which is kind of flaky. Intuit has had this for a few years now on the TurboTax product, and I don't know why, because it doesn't really help you. It gives me heart attacks. Every time I'm doing taxes, my taxes go up to 98,000. Oh my God, did it actually go down? Oh, it went down. I should be celebrating, but I don't have any indication of that.

So, these are the kinds of things that I like to tease out with the pattern library and the anti-patterns and stuff, like from Yahoo Photos, where you're dragging and dropping photos into an album. The album doesn't light up, so you get a dialog box and you get dancing hamsters, and then you get another dialog box. The reason you get those two dialog boxes, which I like to call idiot boxes, is because the interesting moments weren't paid attention to.

When you drag over the album, you could light the album up. You could have a number beside the album, how many things are in the album, and it changes when you drop it in, or you could do any number of things. You could put it in a grid, for example. You don't have to, but if you put it in a grid, you'd be able to see that, "Oh, I could interact here."

Instead, they had a rent-a-designer problem at Yahoo on the photos team. They were just around the corner from me. At the time they didn't have anybody who was full-time dedicated to it; they kept cycling different people in. This was about when Flickr was bought, and so there was some debate about how much to invest in it. We had some really great engineers working on it who eventually went on to Flickr, so they had this pent-up ability, but they were tasked to do stuff like this, those idiot boxes and things.

And then, you've got this amazing plethora of interfaces out there that get excited about animations, animated menus and things that slide. Instead of me actually looking at the page, I'm dealing with animated menus. I've been championing that we get rid of all animated menus, animated drop-downs, that they should be illegal on the web, because they don't really add anything.

This is NASA.gov which is still doing the same thing where you have the drop-downs. If you go fast enough, I've been able to get three drop-downs open at the same time. If you can beat me and get four, send me an e-mail.

This one is really weird. This is Barnes & Noble's, and when you get the actual pop-up, you get this lightbox effect. But if you'll notice in a second . . . you also get the hovering-cover problem, where you cover a thing up. It's not a carousel. It's actually a conveyor belt. It moves at a certain speed. You can't speed it up. You can only change its direction. You can only stop it.

It's kind of like watching the old Lucy episode in slow motion. Nothing actually falls off the conveyor belt though in this case. This is how you can break that delicate balance. These are a bunch of anti-patterns.

I think I'll just stop it there and, maybe discuss a little bit about Netflix. And then, we can open up for questions.

When I went to Netflix, one of the reasons I went there was I was always an Alan Cooper kind of fan, though not totally in the Alan Cooper camp. I don't believe that you can design totally by fiat, but I liked his approach on patterns and principles. I knew that going to Netflix would be a great balance for me because of the strong AB testing culture and the quantitative side. I went there with that in mind, and I did learn a lot about that.

What was interesting to me about Netflix was just seeing how simple and dumb we had to make it. This is just typical of a large site that has a ton of video games, yes. That's all right. This is when you think: why did I make that my ringtone? That's when you have that thought.

At Netflix, some of the things we learned in the last few years especially . . . the site still has some really bad design elements. Donkey teeth, we call the tabs. Everybody on the design team calls them donkey teeth because we've wanted to change those tabs since the day I got there, and it just keeps getting prioritized down and never quite makes it. You know what I'm talking about? Those little yellow tabs? If you don't, go check the Netflix site. You'll see why we call them donkey teeth.

What really moved the needle the most was getting rid of text on the page. We used to have synopses around each one of the movies. Making box shots large, making things more visual. Putting transparency in the site was a huge thing. What often happened is we would make a change, AB-testing-wise. From an AB testing perspective, we might create six or eight variations on an experience, and then the control.

What we'd see at the end of that, after a month or two months, whatever, was: okay, well, it lifted queue adds, which is consumption: adding DVDs or movies to the queue or watching something. It raised consumption, but only on the page we did it on, not overall.

Well, it's a local lift, and yeah, that's not bad. It made that page better, but overall it didn't really help the site. And that's actually pretty common when you're down in the tail trying to fine-tune stuff. You'll find out that people only have so much appetite, and how do you actually increase their appetite to consume more? Most people only have so many hours a week. They're not going to consume more, so there's only so much you can do about that.
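
For context on the mechanics he's describing, here's one common way to split members across variations plus a control: hash a stable member ID so each member always lands in the same cell. This is a generic sketch, not Netflix's actual allocation system.

```typescript
// Deterministically assign a member to one of N test cells.
// The rolling hash is illustrative, not a production-grade choice.
function hashString(s: string): number {
  let h = 0;
  for (let i = 0; i < s.length; i++) {
    h = (h * 31 + s.charCodeAt(i)) >>> 0; // keep it an unsigned 32-bit value
  }
  return h;
}

// Cell 0 is the control; cells 1..numVariations are the experiences.
function assignCell(memberId: string, numVariations = 8): number {
  return hashString(memberId) % (numVariations + 1);
}

console.log(assignCell("member-12345")); // same member, same cell, every visit
```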

We tried some interesting experiments. We tried a few different things. Let me just pop out of this and pop down to the bottom; it helps to see a few things. Let's see, probably this guy here. This was rate-and-replace: if you say "not interested" or you rate something, it takes that movie away and puts another one in its spot.

We knew we wanted to make the site more interactive, because it had interactivity around the little pop-up and other stuff, but it wasn't really that interactive around recommendations. We wanted to make recommendations much more dynamic. So, we added this and put it in test, and what we'd do at Netflix a lot is make sure we put a lot of new members through it. Because there are a lot of new members signing up all the time, you can have a pretty good flow of new members going through an experience.

The goal was to raise taste input without hurting consumption, taste input being rating: saying that you like something, in this case with a star rating. By raising taste input, you can feed more information into the recommendation engine, and the recommendation engine can then give you smarter choices for movies, which can then raise consumption over time. And if you raise consumption . . . consumption is a proxy for retention, retention being whether you stay as a member or not. You'll stay as a member if you're finding value in the service. That's the whole value proposition of Netflix in a nutshell.

So, new members: what happened? Do you think that was good for taste input? Good? Bad? It was good. It was actually really good. It raised taste input. However, retention dropped. People canceled membership, and consumption dropped. Now, they didn't cancel because we had this feature; it's not that they saw the feature and went, "I hate Netflix, I'm going to quit."

It's this whole thing of there's only a certain amount of time that a user has to do something, to understand the value of a service. So, a new member coming in, they have a two-week trial. They don't know much about Netflix. They get in there and go, "Oh, I rate a movie. Oh, it's cool." They sit there and do that, and then lunchtime's over or the coffee break's over or it's time to eat, and they go away and they get distracted.

Their lives don't revolve around Netflix. It's just a site that they visited. They're trying the service out. It doesn't cost them anything; it's a free trial. And then the two weeks run out and they go, "Well, I didn't get much value out of that. I could go rate some more movies." They don't say that, but that's kind of what's happening, and they end up canceling.

So, what we ended up having to do was put it in a different place on the site, on an actual rate-movies page. We knew that it was effective. It worked. It did increase taste input, but it didn't actually help consumption and, therefore, retention: members finding value. Putting this same experience over in a different area where existing members go . . . in hindsight it's like, duh. It's pretty obvious.

That's the way that all of the testing ends up being. It's totally obvious once you . . . hopefully, it is. It's not always, but hopefully it is. This is really one of the few things that raised taste input, raised consumption and increased retention, which was actually pretty amazing.

The other big one was transparency: just putting text near a recommendation, or a row of recommendations, that actually said, "Here's why we're recommending this movie, based on your interest in this," or whatever. Our theory, our hypothesis when we came up with that, was that if we can increase trust, then people will take the recommendation as something they'll really follow. It's like a trusted friend.

One thing Netflix never did well, has never done well, and probably will never do well is community, social. This is the algorithmic approach. You can think of Netflix as being the Google of movies, because it's really the engine, and everything else is fueled by five or six core sets of algorithms that figure out which movies to recommend. What feels like a light human touch is just a cold hard set of servers in the cloud, now moving out to Amazon's services. So, I hate to pop your bubble, but no. A lot of love is put into that, though.

We actually have Todd . . . I say we; they have Todd Yellin, who's the head of product management. He used to be a filmmaker. There are some great product managers there, and Todd has just a total love for films, and a number of people there have a huge love of films. They really look at what's being recommended, and debug it, and understand it.

The other big thing that Netflix has been working on that really moved the needle a lot is similar to what Pandora did with the Music Genome Project. If you're familiar with Pandora, they have people who actually take music apart into its constituent parts and, in essence, tag the music.

Netflix started this totally algorithmically, based on similar movies and some core kinds of values. Then, they began to hire people down in Beverly Hills to tag movies, to micro-tag movies. Micro-tagging is why, if you go to the site, you'll get pieces from the 18th century that have a quirky ending, for people like you, because all of the micro-tags get stuck together. Now, you want to be very careful not to end up with erotic movies for three-year-olds. Hopefully, that never happens.

We actually had to debug stuff like that, because things would get put together in weird ways. When things are tagged and related, you get some really odd combinations, so we had to go through and make sure that things wouldn't come up that would be embarrassing. Nothing quite that bad; that was our extreme example, just to call it out.

So, being able to micro-tag is a form of personalization, because now, based on the movies you've watched, we can come up with this micro-genre, this alternate genre, this personalized genre. We did that, and we got lots and lots of tweets from people saying, "Wow, Netflix really knows me," and they would quote whatever their home page was showing in a row from a micro-tag perspective. Again, just a cold-hearted algorithm.
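
As a toy illustration of how micro-tags could compose into a personalized row title, and why odd combinations had to be debugged, here's a short sketch; the tags, template, and safety check are invented, not Netflix's tagging schema:

```typescript
// Hypothetical micro-tags inferred from a member's viewing history.
const dominantTags = ["Quirky", "18th-Century", "Period Pieces"];

// Stitch the member's dominant tags into a personalized row title.
function microGenreTitle(tags: string[]): string {
  return `${tags.join(" ")} For You`;
}

// Guard against embarrassing combinations, the kind Bill says they debugged.
const incompatible: Array<[string, string]> = [["Erotic", "For Kids"]];
function isSafe(tags: string[]): boolean {
  return !incompatible.some(([a, b]) => tags.includes(a) && tags.includes(b));
}

if (isSafe(dominantTags)) {
  console.log(microGenreTitle(dominantTags));
  // -> "Quirky 18th-Century Period Pieces For You"
}
```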

A lot of love was put into tagging those movies to get that information. That was another strong thing. In the last three years, I think those are some of the main drivers that did the most for the site.

I'll stop talking now.

Moderator: We've got about ten minutes yet.

Bill: We started five minutes late. Yeah. So, I want to open up for questions.

Moderator: Designing by analytics versus principles.

Bill: Yeah. What happened at Netflix was that it was totally the downhill skier, shaving off the microseconds or the milliseconds. That's the whole thing: you're trying to get faster and faster, versus an X Games kind of play. As soon as the TV came into play, that changed the game, because you now had a totally different space. That opened the door for us to bring in a lot more qualitative research, a lot more research studies, a lot more prototyping.

They didn't see any value in prototyping. They didn't see any value in much usability. It was only a few years ago, when Gibb came in to head up product management, that they actually even did any usability research under the marketing team. Up to that point, the engineers knew they didn't understand design, so they leaned on AB testing to guide them. They had some good hunches, and they had some people come on with good hunches.

So, the balance between the two started to happen at Netflix by having a totally different space that you don't know what to do with. I still think they lean more . . . they're very strong in the AB testing world, because a lot of people think Netflix is better than it is. That's what I would always say, because they get a movie recommended to them, they like the movie, they have a great experience with the movie, and so they transfer that love back to the site.

If you can create a service like that, it's golden, because all you've really got to do is get out of the way. But isn't that really the truth, that great experiences are really about getting out of the way? Another question.

Audience Member: Question on the AB stuff when you start looking at things.

Bill: Yeah.

Audience Member: What's the cycle between knowing the initial data and seeing the secondary data, say, if the first data doesn't make sense? Is that something that you're looking for?

Bill: Yeah. We would look for it; usually within a few weeks we'd have enough data to know if a test was giving us weird results or not.

Audience Member: Were you set up to look for secondary sets of data? Were you thinking, oh, we've got this to test and this to test and this to test after we push this into a stream of testing?

Bill: Yeah. Usually, what happens is there was a hypothesis and there were a number of different approaches to test that hypothesis, like: transparency will create trust, and trust will lead people to accept recommendations and watch stuff they like. So, that found its form in lots of different ways. If that test didn't work, we still had the hypothesis we were trying to work out. And so, we would try something . . . Usually, we had several things in mind to try, and they were queued up. Is that what you're getting at?

Audience Member: Yeah. And what type of data were you using to support your decision making? You've got raw data and analytics.

Bill: You're talking about the metrics themselves?

Audience Member: Yeah. Metrics, customer service. What did you . . .

Bill: So, on the member side, hardly anything around engagement or impressions or anything you'd typically think of. It was all around consumption, consumption being adding movies to the queue, or watching a movie, or adding an instant movie to the queue. And there was a formula around trying to tease out what consumption meant.

Consumption is a proxy for retention, because if you look at Netflix's structure, it's around acquisition on the non-member side, and it's around retention on the member side. And you've got to keep your costs down. Those are the main drivers. How do you measure retention? You measure it through consumption.

With the old shipping model, you couldn't wait for a shipment to happen, and you couldn't detect whether somebody played a disc in a DVD player. So, you had to back upstream to the queue add, and that was the data, those were the metrics we looked at.
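
Bill doesn't give the actual formula, so the weights below are purely hypothetical; the sketch just shows the idea of rolling the observable upstream events into a single consumption score that stands in for retention.

```typescript
// Purely hypothetical weights; Netflix's real consumption formula is not public.
interface MemberActivity {
  queueAdds: number;        // DVDs added to the queue
  instantQueueAdds: number; // instant (streaming) titles added
  streamsWatched: number;   // titles actually streamed
}

// Roll upstream events into one score that proxies for retention:
// a member who keeps consuming keeps finding value, and stays.
function consumptionScore(a: MemberActivity): number {
  return 1.0 * a.queueAdds + 1.0 * a.instantQueueAdds + 2.0 * a.streamsWatched;
}

console.log(consumptionScore({ queueAdds: 3, instantQueueAdds: 1, streamsWatched: 2 }));
// -> 8
```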

Audience Member: How long have you had that rigid structure? When you start a business, you don't know those things.

Bill: Well, Netflix is 10 years old, 11 years old. So, it was before me.

Audience Member: Okay.

Bill: I don't know when they actually set those in place, because initially they weren't even a subscription model. They were a "rent one at a time" model at the very beginning.

Audience Member: You mentioned that you kind of keep the worlds separate, but how often did the customer service, the customer service component, did it actually . . .

Bill: So, when you looked at the customer side, the support side, we looked at those issues, and those would feed in. At the end of the day, you really couldn't do harm. You were not allowed to do harm to retention. That, and churn, those sorts of things. Those were the hard numbers that you just didn't want to violate.

So, you could do something to satisfy customer support needs, but it had to be something that didn't also detract from the main experience of watching movies and enjoying movies.

Audience Member: So, you were willing to increase customer service costs if it increased retention.

Bill: Slightly. Yes. The thing is, though, Netflix, even this last year, was number one or number two, right behind Amazon. They've been number one for seven years in a row in customer service satisfaction. A lot of that just has to do with, I think, the move to the phone, even though people thought that was silly; I mean, moving away from e-mail to phone, because it put a personal touch on it, and e-mail is really kind of crappy for getting an answer back.

They were real careful about monitoring wait times. We published the wait time on the site, so whenever you were going to make a call, it said the average wait time is one minute or a minute and a half or two minutes. There was a little number there that you could take and quote back to the person, until we got a better system that made it a lot faster when you got on the line. That was something we worked on with Vikram; that was some stuff Vikram did, yeah, when he was there.

Moderator: Probably a couple more questions, a couple more questions.

Audience Member: Yeah. You talked about the donkey teeth, the tabs.

Bill: Yeah. Yeah.

Audience Member: Were there any usability issues with them?

Bill: Not really. That's why we never really worried about it much, except that we wanted to create a better visual style overall for the Netflix site. One of the things we had tension with, that we went back and forth on, was how consistent to make everything, because when you're AB testing, you're changing stuff all the time.

And so, one part of the site, like you noticed, is dark here and the other part's not. That was to call it out visually. At some point you end up with a Frankenstein site. And so, we actually had a lot of conversations about that. We didn't have any good answers to it. But the tabs work. People get the tabs, and we were actually always removing tabs; we did tests where we got rid of most of the tabs. So, it was really more of a visual thing.

Also, you're just thinking about the TV and branding going across to the TV. If you don't have this donkey-teeth tabbing, you could have something that creates the same kind of branding across, so [inaudible @29:10]. Other questions?

Audience Member: Is there one more?

Audience Member: If there aren't any more questions, one thing that we always come across, and this is just broad, looking for your opinion: you used the term user experience. It's pretty front-loaded. What does that mean? I'm curious how much technical expertise you bring to the table with your decision making.

One of the things that we find is a lot of people get confused between people who have skills and can implement stuff versus people who direct, and don't know the difference. And how much do you think user experience, in your mind, is about understanding the implementation of things as much as understanding the outside [inaudible @29:52]

Bill: You have to understand it, but you can't let it limit you. That's the trick: as long as you don't get limited by it. When you're a designer and you start implementing something, you immediately lose the focus. You start worrying about what you've created, the love for it, and all the constraints that you face.

What I think is really important between UX and engineering is a shared understanding, and that shared understanding can take lots of forms. It can be documentation if your teams are across the water, remote or whatever, or it could be something live and living like a prototype, or it could be hallway conversations.

So, it's important to get design literacy into the engineering team and engineering literacy into the design team. A lot of the work of getting the two teams to work together is having a common vocabulary. That's one of the things that I enjoyed about the pattern library. To me, the pattern library was not so much about the practice itself.

The power of the pattern library was the fact that Yahoo did it: the fact that it stated that you could have a vocabulary that design and engineering could share, and that it called attention to some things that people weren't noticing, things happening on the web that they needed to think about more carefully. Those were my thoughts on it.

So, things like patterns, those are all good techniques to get teams to rally around and work together, but I look for hybrid folks. Usually, at least, some on the team have some hybrid skills. If they don't have hybrid skills, I work to try to figure out how they can get a deep appreciation for what the other side is doing.

I've seen it over and over at companies where there was a breakdown. I remember at Travelocity there was a breakdown. I was talking to the teams, and the problem was the engineering team did it in JSP and Java, and part of the design team was putting together HTML and CSS. And they'd throw it over the wall, and then, they told me, they'd cut it up. I said, "Well, what does cut it up mean?" And they couldn't explain it. I said, "There's your problem."

I said, "Okay, let's get the two teams together, and let's walk you through what they're doing. And let's see if there's some ways that we can cut this whole crazy process out, and they did and it made a difference.

So, that's the kind of stuff, I think: two teams working together.

Audience Member: If you had to recommend one last thing, that one thing for user experience or interaction design, what skill would you say, "Hey, go do that first," or move in that direction?

Bill: That's a tough one. I've got this design lenses project that I've been working on, which is basically taking other fields of study and bringing them into the discipline. Almost any other field of study that you can bring in cross-discipline is important. It also doesn't matter what it is, as long as it's something that exercises a different part of the creative brain that you bring to the table, whether it's music, whether it's psychology, or whatever.

In fact, if I said there was one, then everybody would look like that, and you'd have a crappy team, because nobody would bring different things to the table. I think it's really, really important to have that kind of cross-pollination from different experiences. I find it really fun and invigorating when you've got a team of people who have totally different backgrounds.

My bias, I would say, is the skill of abstracting things, because of my software engineering background. That's why you see the patterns, you see my book. It's a reductionist approach. I take a very reductionist approach to design, which is not always the best way to do it, but I'm a captive of my thinking, and I catch myself going down that way. That's my thing.

Moderator: I would like to thank you for coming.

Bill: Yeah.

Moderator: It's been awesome to have you.

Bill: It was good to be here.
