Psychedelic Media

Thursday, June 11, 2009

Reflective Post

Blog Content
During the sixteen weeks of posting I tried to keep the content very broad, covering all aspects of networked media production. I took careful measures to ensure all posts reflected some personal insight relating to the issue at hand. I avoided annotating information that had already been established, such as lecture notes and wikis. Instead I used these sources, among others, for post ideas; once I had some direction I would research a topic further so that I could present an unbiased opinion/angle on that particular subject. I also tried to avoid relying on one primary source such as Wikipedia; I wanted to demonstrate the overall understanding that can only be achieved by looking at different sources, i.e. the bigger picture.

Audience
My primary audience is people who are interested in networked media practices, particularly people studying or working in that field. I found that posting my opinion of other colleagues' posts, such as the post Media Royals, sparked interesting and controversial debate that resulted in additional posts resolving the issue. Not only is this interesting for the reader, but it also encouraged me to further investigate my own understanding of networked media practices.

Connectivity
Overall my blog did not receive much traffic despite my using a range of web advertisement services, one of which was Technorati. I could have focused more on following other colleagues and reflecting on their content instead of basing my posts around the latest media issues, or even balanced the two. The most popular posts were those that challenged or contradicted a colleague's opinion; this is reflected in the number of comments on those posts. Other traffic sources included spam from people promoting their services through comments.

Web Services
Blogspot did provide a number of useful services that helped maintain and enhance the blog and assisted in writing posts. The Blog Archive, located at the top right of the page, neatly arranges posts in chronological order. This helps viewers navigate the blog's infrastructure, selecting particular posts that are of interest. Furthermore, the blogroll just below the Blog Archive is a great feature that allowed me to add relevant links to other blogs I found interesting. What I thought was neat is that Blogspot provided a service that allowed other bloggers to follow me, and likewise I could follow them. Lastly, Blogspot provided a feature that allowed me to paste HTML/JavaScript into a gadget that I could then display in my sidebar. In a post called PRESENTATION, I touched on data visualisation and used the word cloud generating web service at http://www.wordle.net/create to generate the underlying HTML code for that post's discussion.

Stem Cell Research

A Short Synopsis
I was browsing the tastiest bookmarks on the web, aka D.E.L.I.C.I.O.U.S, and stumbled across an article on how a patient had recovered their eyesight through the use of stem cell treatment. One question kept circling my mind: to what extent can man play God? Before posting uneducated conjectures on this issue, I decided to play it safe and do some background research. If I haven't caught your attention so far, read on....

Embryonic Cells
Note that not all stem cell research involves creating, using and destroying human embryos, although in recent years the media has focused strongly on the topic of embryonic stem cell research. But why use embryonic cells and not other cell types? The answer lies in the embryonic stem cells' immaturity and potency. Unlike other cells, embryonic stem cells are capable of differentiating into any other type of cell needed by the human body. This offers medical treatment for a range of conditions, such as extensive repair of human tissue, degenerative conditions and genetic diseases. The drawback is that to use embryonic stem cell treatment, embryonic cells first need to be generated through in vitro fertilisation (IVF). IVF tends to generate a lot of unused embryonic stem cells that are discarded. This raises many sensitive issues. But first let's look at some alternative solutions.

Alternative Solutions
To avoid sparking further controversial debate over the morality and ethics of stem cell research, researchers are working on techniques for isolating stem cells that are as potent as embryonic stem cells but do not require a human embryo. Researchers believe that human skin cells can be coaxed to "de-differentiate" and revert to an embryonic state, although more studies need to be carried out. It has also been found that the fluid surrounding the fetus contains potent stem cells. Lastly, researchers are working on techniques for reprogramming adult stem cells to behave differently; the trouble with these adult cells is that they are subject to DNA abnormalities caused by toxins and sunlight. Researchers are working on better techniques, but it may be some time until they are perfected.

Ethical Issues
There are many opinions on whether embryonic stem cell research is ethical; personally I think it should not be allowed, as the destruction of an embryo is in fact killing a potential human life. Some believe embryos are not equivalent to human life while they are still incapable of surviving outside the womb. Others ask: if abortions are legal, why can't the embryos be used for medical research? These and other questions have no single answer, and opinions vary from person to person. I encourage you to broaden your knowledge and form an educated opinion based on your understanding of stem cell research. Until next time :p


Friday, June 5, 2009

Web 2.0 is ________

Uninformed Web 2.0 Members
What exactly is Web 2.0? A lot of people are oblivious to the fact that they are participating members of the Web 2.0 era, but describing Web 2.0 in one paragraph would only scratch the surface. So I can offer an alternative solution... I will highlight some of the key features of Web 2.0 and provide resources for further investigation.

Web 2.0 is YOU!!!
How do you feel knowing that you're a part of something but don't quite understand what it is? Let's take a step back. Web 1.0 allowed users to log on to the internet and consume media. In the dawn of the internet, pages were static and information was generally provided by a centralised source, similar to how traditional media distributes content using a newspaper.


Wednesday, June 3, 2009

Data Visualisation

NOTE: This post relates to project two; a copy of the visualisation can be obtained from here.

Introduction
The idea of these graphs is to convey the contextual information within a given book. A user is able to discern the dominating theme of a book by analysing the growth and decline of word frequencies. Furthermore, the graphs present interesting information on many layers, one of which is the shift in frequencies between the Old and New Testament. From a historical context, this suggests that the Old and New Testament were written in different periods of history, each defined by the religious beliefs and attitudes of society at that time. The following subheadings discuss the challenges and processes involved in designing the graphs and the reasons behind the different decisions.

Challenges
Choosing A Project
The initial challenge was to define the scope of the project. This required me to assess the specifications and benefits of each project option. Among the many specifications, time management was a leading factor. Not thoroughly analysing what each project required, and the associated skill set needed to fulfil those requirements, really had an impact on the productivity and time management of the project. Initially, I intended to choose the option that required me to create a geo-narrative using Google Maps. My idea was to develop a photographic tour of Canberra, but due to transportation constraints and technical constraints with Google Maps' application programming interface (API), this was not achievable. As a result I was forced to reassess the situation and consider a different option. Data visualisation appeared reasonable, but not as straightforward as I first assumed.

Gathering Data
The challenge in gathering data was that I had to ensure the data was accurate and reliable. For an observer, this would guarantee that information extracted from the visualisation was genuine. Secondly, I wanted to choose a data set that was unique and discrete; the type of visualisation I anticipated relied on data that wasn't continuously changing over time. I figured that data gathered algorithmically would eliminate accidental errors and the need for human interaction. My first attempt at gathering data proved problematic. I intended to develop a program using the C language to analyse an electronic version of the Bible (King James Version). However, I didn't take into account the functionality needed before starting out, and as a result I ran into difficulties maintaining the code as new features were added. I resorted to an open source analysis tool called TextSTAT-2. The only drawback is that I had limited knowledge of how the word frequencies were gathered and was obliged to rely on TextSTAT-2's judgment.
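For illustration, here's a minimal sketch in Python of the kind of counting my abandoned C program was meant to perform. The file name and the simple tokenising rule are placeholders of my own; TextSTAT-2's internals may well differ.

```python
# Count how often each word appears in a plain-text file.
# The tokenising rule (lowercase, letters and apostrophes only)
# is an assumption, not TextSTAT-2's actual behaviour.
import re
from collections import Counter

def word_frequencies(path):
    """Return a Counter mapping each word to its frequency."""
    with open(path, encoding="utf-8") as f:
        words = re.findall(r"[a-z']+", f.read().lower())
    return Counter(words)

# Example: the ten most common words in one book (placeholder file name).
freqs = word_frequencies("genesis.txt")
print(freqs.most_common(10))
```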

Visualizing Data
The initial challenge was to choose an effective visualisation tool. Having only limited knowledge of Excel, I decided to create the visualisation in MS Paint. The drawback with MS Paint is that all the house-keeping is done by the user; I found drawing the grid, calibrating the data and connecting the lines to be very tedious and visually difficult to comprehend. After perfecting the graphs in MS Paint, I decided to use a web-based visualisation service called ManyEyes instead. ManyEyes, unlike MS Paint, took care of the internal house-keeping and provided useful features that allowed users to interact with the data. Unfortunately, many attempts to upload my data set failed, the format not being recognised despite massaging the data and working through detailed tutorials. With only one option left, I resorted to using Excel. All the computations for normalising the data were done automatically, and Excel had no trouble associating the data with its visualisation.

Processes
Choosing The Themes
The process of choosing a theme required looking at key words and determining if they related to each other. I looked through each book of the Bible and wrote out a list of key words for each particular theme. Then I did a rough analysis on the frequencies of the chosen words and omitted words that did not produce reliable data sets. I was particularly interested in data sets that gave a varying average in word frequencies throughout all the books. This decision was based on the fact that an observer could analyse the frequencies of different books and easily compare them.

Extracting The Frequencies
A soft copy of the Bible I used for this project can be obtained from here. Unfortunately, TextSTAT-2 didn't allow analysis of multiple sources. This constraint required me to separate each of the books into different text files. Once partitioned, I had to open each book with TextSTAT-2 and individually query every word frequency. With 15 queries per book and a total of 66 books (990 queries in all), this process did take some time. To eliminate potential inaccuracies I entered each word twice. Furthermore, I had to decide whether or not I would accept variants of the same word, e.g. love, Love and love's. I decided to include the variants, as it's just the grammar and not the definition that changes.
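In hindsight, those 990 queries could have been scripted. Here's a rough sketch in Python, assuming one text file per book and the chosen words in a list; every file name and word below is a placeholder.

```python
# Loop over every book file and every chosen word, counting variants
# (love, Love, love's) together, and write the results to a CSV file.
import csv
import re

BOOKS = ["01_genesis.txt", "02_exodus.txt"]  # ...one file per book, 66 in total
WORDS = ["love", "anger", "heaven"]          # ...the 15 chosen words

with open("frequencies.csv", "w", newline="") as out:
    writer = csv.writer(out)
    writer.writerow(["book"] + WORDS)
    for book in BOOKS:
        with open(book, encoding="utf-8") as f:
            text = f.read()
        row = [book]
        for word in WORDS:
            # \b...\b matches whole words, (?:'s)? folds in possessives,
            # and re.IGNORECASE folds in capitalised variants.
            pattern = r"\b" + re.escape(word) + r"(?:'s)?\b"
            row.append(len(re.findall(pattern, text, re.IGNORECASE)))
        writer.writerow(row)
```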

Visualising The Data
After the data had been collected, I needed to input it into Excel. A copy of this spreadsheet can be obtained from here. Before visualising, I needed to ensure the data was normalised. This would allow users to compare relative word frequencies without the frequencies being dependent on the size of a given book. The process involved taking the frequency of a chosen word, dividing it by that book's length and multiplying it by the size of the largest book. Once normalised, it was a matter of choosing a visualisation. Excel provided an excellent selection of visualisations and options for customising them. For this type of data set, the line graph was the most appropriate.
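As a worked example of that normalisation step (the numbers here are made up purely for illustration):

```python
# Normalise a raw frequency: divide by the book's length, then scale
# by the size of the largest book so books of any size are comparable.
def normalise(freq, book_len, max_book_len):
    return freq / book_len * max_book_len

# e.g. a word appearing 120 times in a 20,000-word book, where the
# largest book is 50,000 words long:
print(normalise(120, 20_000, 50_000))  # -> 300.0
```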

Analysis
The gradient of the lines between each book of each theme suggests a change in either content, message or mood, although it is difficult to predict the author's exact intention or intended meaning without understanding the context in which a word is used. That said, the graphs represent valuable analytic information. The following conjectures are entirely based on my interpretation of these graphs, which is likely to differ from someone else's.

The themes of Love and Will follow similar trends in the growth and decline of word frequencies. Both graphs present a smooth transition between each book, with momentum increasing for only a small number of books. This suggests that the overall theme is consistent, with emphasis given to only a select few books. On a different level, the frequency of words could imply that the authors of different books had similar insights. Comparing the theme of Love with Will and Anger illustrates that love is the dominant theme, with the word “heaven” peaking at a frequency of 1200 in the book of 1 John.

The theme of Anger, unlike Love or Will, presents different analytic information. There are radical changes in word frequencies between all the books, which could suggest that authors at the time had varying interpretations of anger. From a biblical perspective, this could also imply God's guidance, which would work well with the key principles of biblical teaching.

Cultural ideas are continually changing as society conforms to the social expectations placed upon it by certain historical events; in particular, the life, death and resurrection of Jesus, and how it revolutionised people's interpretation of biblical teachings. Distinct shifts in frequencies between the Old and New Testament support this view. In conclusion, these graphs present information in an easily interpreted format; a similar analysis would be difficult to extract by examining the raw data source.

Monday, June 1, 2009

Software Driven Media

Networked media, as the name suggests, involves distributing content over many networks: the internet, CB radio, radio stations, digital television, and the list continues. However, this would not be possible without software to instruct these devices, and scripting languages in particular play a critical role. In fact, what you're reading now is a series of tags interpreted by your browser and displayed accordingly :p, not that you didn't already know. Scripting languages such as JavaScript, Ruby and Perl have made it easy for people to create interactive and dynamic content, that is, embedded content that operates independently of the page itself. One example is an Internet Relay Chat (IRC) applet, but if you look around I'm sure you will find a lot more.

References
Lecture notes

Wednesday, May 27, 2009

MashUps

The internet is radically changing; people are constantly thinking of new and innovative ideas that revolutionise how users interact with the web. Mashups are a great example: a user is able to combine data or functionality from two or more sources to create a single integrated application, thus creating a service that was not originally provided by either source (a rough sketch of the idea follows after the list below). Society has realised that it's better to provide tools for users to work with than a fully featured framework for presenting information. Likewise, open source development is following the same trend. Although Microsoft holds the monopoly, a lot of people are now beginning to adopt Linux, the reason being that Linux encourages users to have control over every aspect of their OS and even provides open source development packages for creating home-brewed applications, similar to how web services provide APIs to interface with their content. It follows:
  • simplicity = limitations
  • complexity = freedom
you choose....
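To make the mashup idea concrete, here's a toy sketch in Python that merges data from two sources into a result neither offers on its own. Both URLs, and the shape of the JSON they return, are hypothetical placeholders rather than real services.

```python
# Fetch JSON from two (hypothetical) sources and combine them into a
# single integrated result: each place annotated with today's weather.
import json
from urllib.request import urlopen

def fetch_json(url):
    """Fetch a URL and decode its JSON payload."""
    with urlopen(url) as response:
        return json.load(response)

# Placeholder endpoints; assume the first returns a list of place
# records and the second a weather report with a "today" field.
places = fetch_json("http://example.com/api/places?city=melbourne")
weather = fetch_json("http://example.com/api/weather?city=melbourne")

mashup = [{**place, "forecast": weather["today"]} for place in places]
print(json.dumps(mashup, indent=2))
```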


Wednesday, May 20, 2009

Media vs. Reality

It's All Around Us
OK, it's 1.14 pm and I've just come back from a lecture. Walking to my residence I stumbled across a rock that triggered a train of thought. It got me thinking about the composition of the rock, and the attributes and properties for which that rock could be used. Surprisingly enough, it's interesting how humans take pleasure in categorising objects according to their structure. After watching The Matrix the night before, I imagined the rock being represented as a finite set of mathematical formulas. In actual fact, all media is not far from that truth. Abstract as it sounds, the same principles controlling the relations between all materialistic objects are similar to the principles that govern the relations between all intellectual objects.

The Intellectual Realm
OK, so you have heard my opinion but are not entirely convinced? Suppose you could record the precise actions of a person playing a video game, then on another day replicate those exact actions; in theory, after a finite number of button presses your character would end up at the same spot. Straightforward as it sounds, one may alter these instructions and still get the same result. Furthermore, the sum of two numbers may give a different result each time the operands are changed, but the addition process is the same each time; it's only the context that is different. A composer of a piece of media may want to challenge the responder to think a certain way. The principles used in the creation of that particular piece of media are similar to those of other pieces of media that fit into the same genre, e.g. advertisement. The composer may use different techniques to portray their message, but the net effect is the same. Just like the attributes and properties for which that rock could be used.... :-)

My Philosophical Understanding
I may have mentioned this in a previous post: "I strongly believe that there is a finite amount of information associated with every instance of intellect; it is just the context that differs". In all my areas of discipline I like to apply this logic and understand how it fits into the blueprint of knowledge.
