Thursday, May 13, 2010

Data Visualization Rationale

My data visualization was designed to help the keen surfer get information fast. The idea came to me because I enjoy a surf when I get the chance, but the TV’s weather reports are too broad and vague for my small town, so I would have to drive down to the beach and look at the surf myself. This not only wastes time but also increases my carbon footprint.
My data visualization shows the coastline from Foresters Beach to Avoca, my main surfing area. Next to each of the 5 beaches that I regularly surf at are illustrations of waves: the larger the picture of the wave, the larger the waves will be down at the beach. I also have wind, temperature, weather, tide and ocean temperature visualizations to help a surfer in need of some quick information.
I have divided the waves as follows: 0-1ft, 2-3ft, 4-5ft, 6-7ft and 8+ft. I have divided the temperature data similarly: 25+ degrees Celsius is shown as a red thermometer, a 15-25 degree day is yellow, and anything lower than 15 degrees Celsius makes the thermometer blue. Like the two before it, my wind data is divided as follows: 0-5kts, 5-10kts, 10-15kts and 15kts+. To show water temperature I used swimming costumes: a pair of board shorts indicates that the water is 23 degrees Celsius or more, a spring suit suggests that the water is 20-23 degrees Celsius, and a steamer symbolises that the water is less than 20 degrees Celsius.
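All of these thresholds boil down to the same thing: looking up a measurement in a set of ranges and picking an illustration. A rough sketch of how the app could choose which picture to show (the function and symbol names are placeholders of mine; only the cut-offs come from the ranges above):

```javascript
// Map a water temperature (°C) to the swimsuit illustration,
// using the cut-offs described above.
function waterTempSymbol(tempC) {
  if (tempC >= 23) return "boardshorts";
  if (tempC >= 20) return "springsuit";
  return "steamer";
}

// Map an air temperature (°C) to the thermometer colour.
function thermometerColour(tempC) {
  if (tempC >= 25) return "red";
  if (tempC >= 15) return "yellow";
  return "blue";
}

// Map a wave height (ft) to one of the five wave illustrations (0-4).
function waveBand(heightFt) {
  if (heightFt >= 8) return 4;
  if (heightFt >= 6) return 3;
  if (heightFt >= 4) return 2;
  if (heightFt >= 2) return 1;
  return 0;
}
```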
I also have data visualizations that don’t use numbers, such as my tide indicator, in which the letters ‘TIDE’ fill up like a glass of water to show how high or low the tide is at the beach. To show the weather I have borrowed the weather report’s way of doing it: a picture of the sun for fine, a cloud for cloudy, a rain cloud for rain and a thunder cloud for storms.
I have designed a timeline to predict the conditions during daylight. The timeline also shows the times for sunrise and sunset once the mouse is over the illustrations at the start and end of the timeline. I only focused on the weather from sunrise to sunset because once there is no light at the beach, every surfer knows that whoever is still in the water is shark bait. I realise that as winter comes the days get shorter and shorter, so I have placed my timeline along the top of the Twitter feed. The ends of the timeline are fixed at 4am and 10pm; there is no way there would be sunlight before 4am or after 10pm at those beaches, so the sunrise and sunset markers will always fall safely inside the timeline rather than breaking the layout. The sunrise/sunset icons will place themselves on the timeline once that data has been received.
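Placing a sunrise or sunset icon on a fixed 4am-10pm timeline is just a linear mapping from clock time to a fraction of the timeline's width. A minimal sketch of that mapping (the function name and the 0-1 output convention are my own):

```javascript
// Convert a clock time to a 0-1 position along a timeline that
// runs from 4:00 (position 0) to 22:00 (position 1). Times outside
// that window are clamped so a marker can never fall off the ends.
const TIMELINE_START = 4 * 60;  // 4am, in minutes since midnight
const TIMELINE_END = 22 * 60;   // 10pm, in minutes since midnight

function timelinePosition(hours, minutes) {
  const t = hours * 60 + minutes;
  const clamped = Math.min(Math.max(t, TIMELINE_START), TIMELINE_END);
  return (clamped - TIMELINE_START) / (TIMELINE_END - TIMELINE_START);
}
```

Multiplying the result by the timeline's pixel width would give the x position of the icon.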
All of my data visualizations, once the mouse is over them, will emit a white glow around the illustration and display the exact data, or the most accurate data available, underneath the picture.
Every piece of data needed to create this visualization can be found at http://www.bom.gov.au/, more commonly known as the website of The Bureau of Meteorology.
I imagined this data visualization as an app on a smartphone, so at the very bottom of the visualization I have a Twitter feed. This allows the visualization to do something that no machine to date can do. Because the sand at the beach is constantly shifting, the waves are always different, and it is impossible to retrieve data on or predict the exact type of wave breaking at the beach. The Twitter feed lets people either close to the beach or driving past it comment on the waves at that particular beach, and other people either further inland or just too lazy to look themselves can read those reports. This way a surfer can know exactly what type of waves will be crashing on the beach. Whether it be flat and messy, or dumpers through to beautiful peeling barrels, you will know the conditions.
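Since the feed is just a stream of short messages, the app's job is only to pull out the tweets that mention a given beach and show the newest first. A rough sketch of that filtering step, assuming each tweet is a plain object with a text and a time field (the hashtag convention and data shape are my own invention, not Twitter's actual API format):

```javascript
// Keep only the tweets that mention a given beach, newest first.
// Assumes tweets look like { text: "...", time: <unix seconds> }.
function tweetsForBeach(tweets, beachTag) {
  const tag = beachTag.toLowerCase();
  return tweets
    .filter(t => t.text.toLowerCase().includes(tag))
    .sort((a, b) => b.time - a.time);
}

// Hypothetical sample reports from surfers.
const sample = [
  { text: "#avoca flat and messy this morning", time: 100 },
  { text: "#foresters clean 3ft peaks!", time: 200 },
  { text: "#avoca picking up, 4ft sets now", time: 300 },
];
```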
I hope that one day I will be able to wake up and see the conditions for my local beaches no matter where I am in the world.

Tuesday, May 11, 2010

Week 13

This week I was busily drawing all the pictures for my data visualization. So far I have drawn:
A personified thermometer to show the temperature.
Board shorts, a spring suit and a steamer indicating the water temperature.
A sun, cloud, rainy cloud and a thunder cloud indicating the weather.
Different pictures of a windsock indicating wind strength.
5 types of waves indicating the size of the wave.
The coastline of the central coast.
A tide indicator visually displaying the tide.
A compass to show the surfer his bearings.
A timeline that will allow the user to slide along to show the varying conditions at the beach within the next 12 hours.
A reference picture showing the times for the sunrise and sunset for a quick reference guide for the keen surfer that wants to be on the first wave but not be eaten by a shark.
I was also thinking of having one of those stock-market-style ticker banners at the bottom of the page, where I would use the Twitter API to have surfers tweet the conditions on their local break, whether it’s sloppy, dumping or even barreling tubes.
All that is left to do is combine all my pictures into one and show my data visualization to the world.

Monday, May 3, 2010

Week 12

I have finally realized what I want to do for my data visualization. I plan on making a data visualization for the surfers of the Central Coast of NSW. My plan starts with the coastline of my local surfing area. Once I have my landscape I want to have symbols on my map representing the different waves, weather, wind, tide and water temp. For example, I have board shorts, a spring suit and a steamer. These will represent the different water temperatures: at 15 degrees or less a steamer will appear on the map, from 15 to 20 degrees a spring suit will appear, and for anything above 20 degrees board shorts will appear. I also want to have an adjustable timeline predicting the different changes that will come about in the next 12 hours. I figured that the data will be easily accessed, i.e. from the Bureau of Meteorology; any weather website would have this basic information. I will hand draw pictures for my data visualization and plan on getting help from Ben on how to present the final copy of my work.

Thursday, April 22, 2010

week 11

Nearing the end of the semester brings about 2 things: the cold and assignments. The last assignment for this semester is a data visualization concept. I still don’t have an idea, but this week in our tutorials I was able to look at a few cool ones. Michael had put a lot of links on Delicious to look at. Towards the end of the tutorial everyone in the class had to analyse and present a data visualization. I had a Crayola colour timeline. It showed how the number of Crayola colours has increased over the years up to the present. In 1903 there were only 8 colours and by 2010 there are 120. The graph shows that the colours increase on average 2.56% a year. It also shows that the number of colours doubles every 28 years. If this trend keeps going, my kids will have 330 crayons to draw with in the year 2050. Hopefully I come up with a sweet idea soon.

Sunday, April 18, 2010

week 10

This week in the tutorial I took part in an exercise where we got into small groups and came up with an idea for a website or online service that makes use of collective intelligence: decision markets, collaborative filtering, voting, you name it. We had to explain how using collective intelligence/the wisdom of crowds is different to, or better than, doing it in a centralised, controlled fashion, and what problems might come up in building the service. What our group came up with in the 45 minutes we had was an application for a smartphone or iPod. This application would act like a portable suggestion device. It would work like Amazon in that it would suggest items based on what you have already purchased. This app would make shopping easier and more efficient. It would also be able to show you where things of interest were using geotagged items. Each geotagged item would have a picture of the item and the variants in stock with prices, and would suggest things that go well with the item. It would also share information with your bank account so it could suggest items within your price range. The app would be completely free, as it uses advertisements during suggestions. It would also remember the average time that you eat, so during that time ads for food companies would pop up.
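The "people who bought this also bought that" idea our group borrowed from Amazon can be sketched as a simple co-purchase count: look at every basket containing the item, tally what else appears, and suggest the most frequent companions. This is only a toy version of collaborative filtering, with made-up data and names of my own:

```javascript
// Suggest items that most often appear in other shoppers' baskets
// alongside a given item. `baskets` is an array of arrays of item names.
function suggest(baskets, item, limit = 3) {
  const counts = {};
  for (const basket of baskets) {
    if (!basket.includes(item)) continue;
    for (const other of basket) {
      if (other === item) continue;
      counts[other] = (counts[other] || 0) + 1;
    }
  }
  // Rank companions by how often they were bought together with `item`.
  return Object.keys(counts)
    .sort((a, b) => counts[b] - counts[a])
    .slice(0, limit);
}

// Hypothetical purchase history.
const baskets = [
  ["wax", "legrope", "boardshorts"],
  ["wax", "legrope"],
  ["wax", "sunscreen"],
  ["sunscreen", "towel"],
];
```

The crowd does the work here: the more shoppers contribute baskets, the better the suggestions get, which is exactly the wisdom-of-crowds advantage over a centrally curated catalogue.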

Sunday, April 11, 2010

Week 9

This week we started discussing Web 2.0. What I took out of the lecture is that Web 2.0 is almost a computer brain. It uses a thing called AJAX. This system does a number of things. For example, if you were to buy a book off Amazon, it remembers what you bought and suggests other titles that you might like. This is done by looking at people who have also purchased the same book and what they have gone on to buy; it also searches within the same genre of book. AJAX also makes it possible to stay on the same page without having to confirm and load a whole new web page every time you enter a piece of information. I was also told of the long tail. It explains that when something on the web is popular, it will generally follow the long tail graph: the graph starts high and then trails off into a long, dragged-out tail, hence the name. But the greatest thing that I learnt in that lecture was Google Trends; it solves so many arguments, it’s the greatest.
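The long tail shape is easy to see with a toy popularity curve: if the item at rank r gets roughly 1/r of the attention the top item gets, the head of the curve is huge while the tail stretches on without ever quite hitting zero. A quick sketch of that idea (the 1/r rule is my own simplification for illustration, not anything from the lecture):

```javascript
// Toy long-tail curve: the item at rank r (1-based) has popularity 1/r.
function popularity(rank) {
  return 1 / rank;
}

// Share of total attention captured by the top k of n items.
function headShare(k, n) {
  let head = 0, total = 0;
  for (let r = 1; r <= n; r++) {
    total += popularity(r);
    if (r <= k) head += popularity(r);
  }
  return head / total;
}
```

With this curve, the top 10 items out of 1000 capture well over a third of all the attention, yet the remaining 990 together still matter, which is the point of the long tail.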

Thursday, March 25, 2010

Website Rationale

The premise of my website is to thank The Rotary Youth Exchange Program. In order to do this, I decided to show viewers pictures of my exchange that capture the great memories I have. I created an online exhibition using my Flickr photos and Dreamweaver. I also wanted anyone viewing my site to know where I took each photo.
My heading was created in Photoshop. I took photos from my collection that had an American flag and used them to make a heading. I also used Photoshop to create the ears on either side of the map.
I wanted the information about exchange to be readily accessible to site viewers, so I incorporated a link to the International Rotary Youth Exchange Program website through the words “The Rotary Youth Exchange Program.”
I used CSS to style my webpage, making the background black and the text white.
I then uploaded 190 photos to Flickr and gave each of them a heading and a geotag. I also wrote descriptions for the photos that required an explanation.
My first approach was very simple: I tagged all the photos with “tatsuyarotaryyouthexchangetoamerica”, a tag that no one else in the world would have. Then I made a simple Yahoo Pipe with a Flickr module. I asked for 190 photos tagged “tatsuyarotaryyouthexchangetoamerica”, but none of my photos came up.
On my second attempt, I asked my tutor Ben to help me with my dilemma. He found a more complicated pipe that required the RSS feed of my Flickr set of photos. The pipe started with a Fetch Feed module pointed at my set’s RSS, which led to a filter, which in turn led to a location extractor. When output, the photos came up as a list, but it only showed the last 20 of the 190 photos that I had uploaded. When I asked for the photos on the map, one photo came up, placed in the middle of Australia even though its geotag should have put it at Los Angeles Airport.
I also asked for Michael’s help, but even he was having trouble with it. I looked through various Yahoo Pipes and multiple blogs that claimed to have the solution to my problem, but they didn’t. I went back to Ben’s pipe and started investigating why only 20 photos were coming up. It turned out Flickr was creating the problem: the photostream feed only makes the latest 20 photos accessible at a time unless you upgrade to Flickr Pro. I then made sets of 20 and added each set’s RSS feed to the Fetch Feed module. Once I put that through, around 15 of the photos came up in their right geotagged positions. Just as I thought everything was going well, when I took the embed code and placed it in my website only 6 photos came up on the map.
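The workaround of splitting 190 photos into sets of 20 and fetching each set's feed amounts to merging several capped lists back into one. A sketch of that merge step, assuming each feed item has a unique id (the data shape is my guess at what a pipe passes along, not Yahoo Pipes' actual output format):

```javascript
// Merge several RSS-feed pages (each capped at 20 items) into one
// list, dropping duplicates by id.
function mergeFeeds(feeds) {
  const seen = new Set();
  const merged = [];
  for (const feed of feeds) {
    for (const item of feed) {
      if (seen.has(item.id)) continue;
      seen.add(item.id);
      merged.push(item);
    }
  }
  return merged;
}

// Simulate 190 photos split into capped pages of at most 20 each.
const pages = [];
for (let i = 0; i < 190; i += 20) {
  const page = [];
  for (let j = i; j < Math.min(i + 20, 190); j++) page.push({ id: j });
  pages.push(page);
}
```

If every page makes it through intact, the merge should recover all 190 items, so any shortfall points at the individual feeds being truncated upstream rather than at the merge itself.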
After a lot of cursing and hair-pulling I have given up on the Flickr + Google Maps mashup. I really wished it had worked; it sort of did in the end. I just don’t get how I lost close to 180 photos in a few steps. If anyone wants to tell me what went wrong, please feel free to show me how to fix my Yahoo Pipe: http://pipes.yahoo.com/pipes/pipe.info?_id=951d7738e397daff5aea9602124cba07. Thank you.

Friday, March 19, 2010

Week 6

Ok, week 6 and my page is looking good. I’ve used Yahoo Pipes to create a Google Maps and Flickr mashup. I’ve uploaded all the pictures of my exchange to America to Flickr. I’ve given each of the photos a location, so when my page shows the map of America it will have photos of the places that I travelled. Each photo also has a description to explain what is going on in it. At the moment I’m having trouble finding the photos that I have uploaded; I think it has something to do with the tags. I also don’t want my map to look like it was just placed there. I want it to flow with the rest of the page, so I was thinking of using Illustrator to draw pictures on either side of the map to make it look professional. Hopefully I’ll have everything sorted out by the due date.

Monday, March 15, 2010

Week 5

This week I finished all the text that I want in my web page and now I am sorting through the pictures that I want to use. As planned, I will do a web page on my exchange and the memories and lifelong friends that I have made. What I still have to do by week 7 is put this text and these images into HTML. Once I have accomplished that I will apply CSS to it and stylise my page to make it look less ghetto. Once less ghetto, I will further pimp out my web page by applying mashups and APIs. These will allow me to use shared information on the net and apply it to my page without breaking any copyright rules.

Wednesday, March 10, 2010

Week 4

One thing that I have learned about this course is that it moves very quickly. Luckily this week was Canberra Day and there were no lectures. I’m going to use next week to study up on APIs, divs and IDs. In the last lecture I was told about div and id, and I was confused about them both. What I think a div does is separate a selected group using the div tags, creating a box where you can apply CSS without affecting the rest of the page. ID tags, I think, let you style a lot of boxes using just one CSS ID rule. Finally I was told about APIs, which let you use another website’s content and manipulate it to create your own web page using the other site’s uploads, like Twitter or Flickr. If I have the wrong idea, can someone lead me down the right track?

Thursday, February 25, 2010

Week 3

It’s week 3 and I feel like O Week was yesterday; time flies at uni. The further I get into the semester, the more the workload seems to multiply. In Networked Media Production this week I was told that I have until week 7 to make my own website; I felt like puking the very moment Michael said it. But the further the lecture went on, the less sick I felt and the more pumped I was to show the world what I could make with HTML and CSS. I want to create a page that shows the amazing memories and lifelong friends that I made during my exchange to Exeter, PA in the USA. I want it to be used and seen as a thank-you page to The Rotary Youth Exchange Program. I also want it to serve as an example of an exchange that they can show anyone who is interested in becoming an exchange student.

Friday, February 19, 2010

My first couple weeks

In the first lecture I was told that I was going to have to do a blog for my Multi Media Production class. The first thing that came to mind when Michael Honey, my lecturer, said blog was eHarmony. Like eHarmony, I thought the only people who blogged were desperate, single old people. Apparently all ages do it, and you don’t just write about what you had for dinner or tell the world that you like long walks on the beach; you can share your opinions and discuss viewpoints with the rest of the world.

My first 2 weeks at Canberra University have gone by quicker than you can say http://yamasakitatsuya.blogspot.com/. In my first lecture I was shown the course outline and how the internet works through a representation of ants. I’m not that much of a tech whizz, so Michael’s interpretation of the internet through ants helped me out a lot. I was also told that the internet has many faults and that it was thought of well before 1991, by a man named Vannevar Bush in 1945.

I have just finished my first full week at university. In it I was informed of how the internet is affecting other industries, different ways that plagiarism can be stopped, and different types of blogs. The course gets pretty far into the technical side of things, and when I first entered the class it felt a little intimidating with words such as bandwidth and disintermediation thrown at you, but if Michael keeps describing things as clearly as he does I think I will be alright.

Wednesday, February 17, 2010