Press "Enter" to skip to content

Month: April 2013

Response to Virilio, Open Sky pt. 3

[Image: “Reality - worst game ever”]

My final reading response of the semester! This time, we’re finishing off Virilio’s Open Sky with a summary of Part 3. The reason for the silly picture above is Virilio’s more serious prediction that “we are about to lose our status as eyewitnesses of tangible reality once and for all, to the benefit of technical substitutes… which will make us the ‘visually challenged.'” The picture pokes fun at the fact that a growing number of people spend much of their time playing online games, interacting with other people in virtual worlds rather than in the real one. The idea that all of our interactions could someday take place through a screen is not too far off. Virilio already talks about how what we see is obscured by the TV screen and by the windows of the vehicles that get us from place to place, and he warns of the dangers of indirect light as opposed to the direct optics discussed in the earlier parts. It seems to me that we’re trying so hard to recreate reality that we’re disregarding the reality we already have. Instead of experiencing the world ourselves, we’re sitting in darkened rooms and staring at screens, living in virtual worlds and neglecting everything outside. Google Street View is an example of the real world becoming a virtual one: you can go almost anywhere in the United States, down almost any major public road, and see pictures of what’s there without actually going.

Virilio also talks about how some of these images are so lifelike that we get confused and have trouble distinguishing CG images from real life. Think of Apple’s Retina Display technology and the 326 PPI (pixels per inch) it’s capable of showing. Virilio talks about lasers beaming images directly into your optic nerves, but high-PPI displays are in commercial use today, and they’re so detailed that you can’t even tell there are pixels. With 4K resolution, the increasing power of GPUs to generate images, and image manipulation tools such as Photoshop and After Effects, we’re getting to the point where images can look absolutely lifelike, and unless you see something for yourself you’ll never really know whether it’s real.

Response to Virilio, Open Sky pt. 2

Paging Dr. Nanobot! In the future, tiny robots small enough to fit into our bloodstream could help remove blood clots, treat diseased tissue, and maintain our bodily functions. Part man, part machine! Part 2 of Paul Virilio’s Open Sky starts out with a discussion of the transplantation revolution, the idea that machines will be assisting with functions inside our bodies as well as outside them. It’s already happening: scientists grew a kidney in the lab just this week. Soon, we may be able to replace all sorts of body parts just as we upgrade our computers. Virilio predicts that in the future we may be able to embed machines in our bodies that allow us to act at a distance, like the DataGlove or the DataSuit. Google Glass is a close contender, letting us interact with the environment in a whole new way, but it isn’t wired directly into your optic nerve. Virilio says there are three intervals that define acting at a distance:

  1. Space: the geometric development and control of the physical environment. Innovations such as the car, the train, and ridden animals such as the horse and the mule are examples of this.
  2. Time: control of the physical environment through the invention of communication tools. Letters, telephones, TV, radio.
  3. Light: instantaneous control of the microprocessor environment. Today’s computers rely on signals that travel at the speed of light.

As a human society, we’ve got a pretty firm grasp on all three of these intervals, and as technology advances we’re not only obliterating the concept of distance, the interval of space, but also miniaturizing our technology. Virilio says that less is more in today’s society, and that fewer human actions are required for tasks that once had to be done manually. With the push of a button, we can lower our blinds, turn on a light, lock our doors, change the channel on the TV, and automate our entire house without getting up from our chairs. We’re now well on our way to automating the last remaining tasks we still do ourselves, injecting robots into our bloodstream to maintain our bodies without us needing to lift a finger.


Response to Virilio, Open Sky pt. 1

This week, we read part one of Paul Virilio’s Open Sky. As pro-technology as all the other readings have been, this one is certainly different: Virilio doesn’t care much for the latest technology. Telepresence, the ability to appear in two separate places at once through teleconferencing technologies such as webcams and the Internet, is collapsing the distance involved in communication. We can communicate with someone anywhere in the world without leaving the house, and because of this we don’t get out as much. Just as cars and planes have shortened physical distances around town and around the world, telepresence is erasing the need to go anywhere at all. Someday, we may all just lie in bed and project ourselves into robots that deliver impulses to our nerves at a distance, without ever needing to move. The example Virilio mentions is the DataSuit, developed by NASA scientists, which lets you feel, see, and hear as though you were somewhere else entirely. According to Virilio, this telepresence is going to destroy us by obliterating distance. Once there’s no need to wait, everything has to happen immediately, in real time, and the flow of information becomes overwhelming. Virilio talks at length about physics and how it relates to time, and about the difference between large-scale and small-scale optics. In the future, he says, we’ll have a tele-existence and become terminal men and women. It reminds me a bit of Jonathan Mostow’s Surrogates (2009), which warns of the dangers of this kind of technology: a giant company controls all sorts of robots that interface with humans.

Code Snippet: Extracting a 24-hour time from a 12-hour time string in MySQL

The snippet:

HOUR(STR_TO_DATE("2:30pm", "%h:%i%p")) = 14

Why I needed it:

I’m currently working on a time and date filter for ClassGet. If you’ve ever wanted to find a morning class that meets only on Tuesdays, this tool is for you! The main problem, however, is that Furman’s course listings file has the start and end times for classes in this format:

Start Time: "12:30pm", End Time: "1:20pm"

and my import script doesn’t convert these into a DATETIME format in SQL. If I want to find a class starting between 12:00 and 14:00, I’ll need to do some conversion. On the front end, I’m using a jQuery UI slider for my hour control that goes from 5 (5:00 am) to 21 (9:00 pm). Why anyone would have a class starting at 9:00 pm is beyond me, but hey, it could happen. I’m not going to worry about minutes, and I know that no class will ever be offered overnight, so I’ll never need to worry about a start time being later than an end time. My script will search for classes that start between one hour and another, so I’ll need to convert the stored times to match. You would think I could just do something like this:

SELECT * FROM classes WHERE `Start Time` BETWEEN 12 AND 15

But it doesn’t work like that. The times are stored as strings, e.g. "2:30pm". We’re lucky, though… MySQL has an HOUR function that will solve that!

SELECT * FROM classes WHERE HOUR(`Start Time`) BETWEEN 12 AND 15

But still, no classes are showing up that start at 2:30. How come? As it turns out, HOUR("2:30pm") returns 2, not 14! How do we fix that? The answer lies in MySQL’s STR_TO_DATE function, which is the reverse of the DATE_FORMAT function. Now, take a look at the final version:

SELECT * FROM classes WHERE HOUR(STR_TO_DATE(`Start Time`, "%h:%i%p")) BETWEEN 12 AND 15

There we go! Now, without writing any PHP or JavaScript or modifying the database structure, I was able to create a time filter for class data.
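The same trick extends to the other end of the range. Here’s a rough sketch (not the exact query ClassGet uses) of what the full filter might look like once both slider values are plugged in, with 12 and 15 standing in for the slider’s low and high hours; the `End Time` column is stored in the same "1:20pm"-style string format shown above:

-- classes that start at or after 12:00 and end before 16:00 (hours only, minutes ignored)
SELECT * FROM classes WHERE HOUR(STR_TO_DATE(`Start Time`, "%h:%i%p")) >= 12 AND HOUR(STR_TO_DATE(`End Time`, "%h:%i%p")) <= 15

Since I’m ignoring minutes, an end hour of 15 just means the class wraps up sometime before 16:00, which matches how the slider treats whole hours.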

Response to Garrett, “User Experience and Why it Matters”

[Image: iPhone 5]

Apple’s iOS home screen doesn’t look much different than it did when the iPhone was first released six years ago. Since 2007, the only real improvements to the home screen itself have been Spotlight search, the ability to create folders, and the Newsstand app’s sliding bookshelf. Other mobile operating systems such as Android are constantly changing, offering widgets, app drawers, live wallpapers, voice search, and the like on their launcher screens; launch-day Android looked a lot different than Android Jelly Bean does. Each Android device also looks and acts a bit different from the next. The HTC Thunderbolt, for example, uses HTC’s Sense interface, while the Samsung Galaxy S II uses Samsung’s TouchWiz. My ASUS SL101 tablet has the stock Android Ice Cream Sandwich interface but adds some ASUS-customized settings. The point is, if you have Android, you aren’t necessarily getting the same experience as another Android user. If you have an iPhone, iPad, or iPod Touch, you’re going to get essentially the same experience on any of those devices, screen size aside. Apple is really good at this kind of consistency.

Today we read the first two chapters of Jesse James Garrett’s The Elements of User Experience, which introduce the concept of user experience and explain why it’s important. Garrett defines user experience as “the experience [a] product creates for the people who use it in the real world.” In Apple’s case, the iPhone is easy to use and intuitive to learn. Interacting with an iPhone is a pleasant experience; everything is smooth and flows well. You don’t have as much freedom to tinker with an iPhone as you would with an Android phone, which also means it’s harder to break. Some parts of the iPhone experience are cryptic, such as the many uses of the home button, but in general Apple has created a finely tuned user experience.

At its worst, a bad user experience can kill you. The Therac-25 radiation therapy machine is a notorious example, delivering fatal doses of radiation due to poorly designed software and a bad user interface. From the Wikipedia article:

The system noticed that something was wrong and halted the X-ray beam, but merely displayed the word “MALFUNCTION” followed by a number from 1 to 64. The user manual did not explain or even address the error codes, so the operator pressed the P key to override the warning and proceed anyway.

While a badly designed website won’t give you radiation poisoning, it certainly won’t win you any business either. Garrett explained that if people have a bad experience on your website, they’re unlikely to return. If they can’t find what they’re looking for, they probably won’t stick around for long.

The placement of web content matters as much as the content itself. Underneath the content is the frame of the page, and under that is the way all the pages are organized. Garrett refers to these as planes: the planes add up to a well-designed web product, and the more abstract planes form the basis for the more concrete ones. As with building a house, you should start with a good foundation and build up. If you get the strategy and scope right when building a website, the structure, skeleton, and surface of the site will be a lot easier to design.

Response to Redish, “Letting Go of the Words: Writing Web Content that Works”

This week we’re taking a temporary break from video and moving on to websites! Our reading for today is Ginny Redish’s “Letting Go of the Words: Writing Web Content that Works,” and the first thing Redish mentions in chapter 2 is that “understanding your audiences and what they need is critical to deciding what to write,” followed by a discussion of audiences. When writing for the web, you need to understand who your audiences are and why they come to your page. All sorts of people will visit your site, and it’s best to categorize them and target different groups specifically.

Best Buy has used this type of profiling to create personas for each of its customer groups. In the graphic above, Best Buy lists the major characteristics of “Maria,” a middle-class mom who only goes to Best Buy when others in her family make her. The idea is that employees can better serve someone’s needs if they know a bit of background about them. Best Buy probably created its list of personas by conducting interviews and doing research to find out what types of people visit its stores. Redish recommends that you do the same when creating web content: observe and talk to users of the site, getting feedback as they go.

Once you’ve formed personas, you should keep them in mind when designing your site. Constantly ask yourself whether each persona will be able to find what they need or accomplish what they came to do on your site. Sometimes, when you have two vastly different audiences, it’s best to have two different websites rather than trying to write for everyone at once. Take a look at the difference between Microsoft’s Windows 8 site, their Xbox site, and their MSDN site. All of these websites feature Microsoft products, and all of them have the same general goal: to get you to spend money on Microsoft stuff. Each website has a different audience in mind, however. The Xbox website envisions a gaming audience that is there to have fun, while MSDN is all about software development and is therefore more serious and professional. The Windows 8 website is targeted mainly at home and small-business users, but Microsoft also has a Windows 8 Enterprise site aimed at those working in IT and large corporations.

If you understand your audience, you’ll know what they want, and if you know what they want you can design your website to work the way they do. Take a look at these comics and you’ll see that a lot of websites miss the mark when it comes to giving people what they want.