Monday, May 12, 2008
Spring/Webwork/Tomcat/Maven?
Maven provides a lot of heavy-lifting capabilities. It lets you take one file, pom.xml, and via its magic, all of the external code required to build and deploy your small module is automatically pulled down from the Gnizr servers, your code is compiled, and any files you add are overlaid on top of the delivered code and automagically built into a war file (the standard format for deploying Java web applications). And I know from personal experience that compiling Java/Tomcat stuff and trying to build a war or jar archive by hand can be pretty daunting.
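As a rough sketch, a pom.xml for that kind of war overlay can be about this small (the coordinates and version below are made up, not the real gnizr ones, and you'd typically also add a repositories entry pointing at the server that hosts the artifacts):

    <project xmlns="http://maven.apache.org/POM/4.0.0">
      <modelVersion>4.0.0</modelVersion>
      <groupId>edu.umbc.example</groupId>           <!-- made-up coordinates -->
      <artifactId>my-gnizr-module</artifactId>
      <version>1.0-SNAPSHOT</version>
      <packaging>war</packaging>                    <!-- tells Maven to build a .war -->
      <dependencies>
        <dependency>
          <!-- the upstream webapp that our files get overlaid on top of -->
          <groupId>com.gnizr</groupId>
          <artifactId>gnizr-webapp</artifactId>
          <version>2.3.0</version>
          <type>war</type>
        </dependency>
      </dependencies>
    </project>

Declaring a war-type dependency like that is what makes Maven's war plugin do the overlay.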
Webwork gives us a nice Model-View-Controller implementation to work with, supporting different languages for the view layer, including JSP and FreeMarker templates, and tying into Spring for dealing with singleton/DAO objects and friends.
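For a flavor of what an action mapping looks like, here is a hand-wavy xwork.xml fragment (the action, class, and template names are made up):

    <package name="example" extends="webwork-default">
      <action name="viewBookmarks" class="viewBookmarksAction">
        <!-- the view can be a FreeMarker template... -->
        <result name="success" type="freemarker">/templates/bookmarks.ftl</result>
        <!-- ...or a plain JSP -->
        <result name="error">/error.jsp</result>
      </action>
    </package>

With the Spring integration turned on, that class attribute can name a Spring bean instead of a fully qualified Java class, which is where the next piece comes in.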
Spring makes it nice and easy to pass common objects into other objects and to control the initialization of different things magically via XML, and with Maven it is easy to extend the existing Spring XML with new XML.
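A hand-wavy sketch of the kind of Spring XML I mean (the class names are made up):

    <beans>
      <!-- one shared DAO instance, created and wired up by Spring -->
      <bean id="bookmarkDao" class="edu.umbc.example.db.BookmarkDao">
        <property name="dataSource" ref="dataSource"/>
      </bean>

      <!-- the WebWork action gets the DAO handed to it; no "new" anywhere in our code -->
      <bean id="viewBookmarksAction" class="edu.umbc.example.action.ViewBookmarksAction">
        <property name="bookmarkDao" ref="bookmarkDao"/>
      </bean>
    </beans>

And because Maven overlays our files onto the delivered war, dropping a file like this in next to the existing Spring XML is pretty much all it takes to extend it.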
For a simple project, Spring+Webwork would probably be overkill, but for a system designed to evolve over time and be used and edited by many users, I think it (or something similar) would be essential for organizing things in a sane manner. Maven on the other hand would most likely be useful in projects of any size.
Subversion: Necessity or Annoyance?
Also, when more than one person is working on the project from separate accounts and/or computer systems, it allows users to make changes to the code, commit those changes, and have them show up in others' copies of the code (without ever having to worry about manually merging changes together). And in complex projects, even if you are the only one working on the project, having the ability to revert changes is key (also, automatically having your code backed up on Google's servers is definitely an advantage :-).
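The day-to-day workflow is just a handful of commands (the project URL below is illustrative; Google Code hosts the real repository):

    svn checkout http://ourproject.googlecode.com/svn/trunk/ ourproject   # get a working copy
    svn update                        # pull in changes others have committed (merged automatically)
    svn commit -m "fix the thing"     # publish your changes for everyone else
    svn log                           # see who changed what, and when
    svn merge -r 42:41 .              # back out a bad change (revision numbers illustrative)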
So... Agree? Disagree? Unsure? DISCUSS!!!
Sunday, May 11, 2008
Web 2.0 at the Department of Defense
The Department of Defense's Defense Intelligence Agency has been working to adopt Web 2.0 technologies, starting with a wiki, since 2004. They are attempting to leverage wikis, mashups, blogs, and RSS feeds to assist their analysts.
This is a departure from the standard use of technology in government, where hardware and software can go years without upgrades and the latest and greatest web technologies are shunned both for security reasons and out of misunderstanding of their uses.
While for valid reasons we won't see these technologies at all levels of government, in certain corners of it we could see them used more and more.
Add some Wesabe to your Mint
Wesabe uses anonymized purchase data to help you manage your finances better, and it has recently launched a system to help you purchase the same items at other locations. It seems like it would be a nice complement to Mint.com in an overall financial strategy.
Of course, when the system says I should buy at Walmart instead of Target, I'll have to refuse that advice. The next step for these financial systems should be to monitor the social and green aspects of the places you shop.
Semantic Hacker
Semantic Hacker has an open API for semantic discovery and is running a million-dollar challenge to use their system in new and interesting ways. The system seems to pull semantic data out of any text source. There is an example on their site where you can paste in text and get the data out. It seems mostly like a keyword analysis system. It would be interesting to have someone comment on the system as it stands.
"Develop a software prototype, business plan or both with commercial viability that is focused on a vertical market. Solutions in finance, healthcare or pharmaceuticals might be good places to start."
SocialDevCampEast - Report from the field
https://barcamp.pbwiki.com/SocialDevCampEast
If you have ever been to an IEEE or similar conference, you know that everything is planned ahead of time and there are committees for everything. It works well, but for a smaller event like the Social Dev Camp East that happened yesterday, the power of Web 2.0 technologies came together to promote a smooth flow throughout the day.
Take a look at the wiki link above. There is a proposed schedule. At a regular conference, that would BE the schedule. At the camp, there weren't even defined sessions before the campers voted on them in the morning. You can see the proposed time schedule got pushed back as people probably showed up a little later than expected. The actual schedule exists on the page alongside the proposed one, showing the differences. Updated in almost real time, this is a huge break from the monolithic existence of regular conferences. It was encouraging and impressive that such an open-source type of event flowed so smoothly.
The actual content of the day, and its overall purpose as I saw it, was interesting. At a regular conference, the sessions might go for two hours each, starting at 8am and ending at 5pm. This camp had one-hour sessions, each on a single topic, ending about 4pm. Since the topics were decided on in the morning, the presenters varied in their preparation: some were ready to go with organized presentations, others were thrown together according to the requests of the campers.
Two sessions in the morning, with discussions throughout, led to a discussion-filled lunch, followed by another two sessions in the afternoon. Each session had four different topics in individual rooms, for a total of sixteen topics throughout the day. Following the sessions there was a camp-sponsored open bar down the street at The Brewer's Art. With sixteen different topics covered, people from D.C., Baltimore, Philly, NYC, and other places, with all kinds of technical backgrounds, made for plenty of conversation for hours after the official sessions were over.
In CMSC 491S/691S we had interesting lectures that led to discussions that had to be halted as the end of class came. Imagine sixteen lectures in one day and hours of discussions afterwards. Contacts, friends and business opportunities were made and the results of the camp will echo for quite some time.
Another camp will be held in fall 2008. Whether it will be in Baltimore, New York, or someplace else hasn't been announced, but it will probably be bigger and better than this one.
Tuesday, May 06, 2008
BarCamp East Saturday May 10
Monday, May 05, 2008
I Need Information…
Today and tomorrow's Internet capabilities are truly dependent on wireless communication technologies and inexpensive electronics; those of us in the software business should not forget that. Computer electronics products allow us to consume all kinds of information in a multitude of forms to suit our lifestyles: some of us sit at our computers reading e-mail, web pages, blogs and videos; others listen to or watch streaming multimedia on their home entertainment systems; still others listen to podcasts on portable audio players; and the mobile computing crowd does all of this and more on their computer-enabled phones (or are they cell-phone-enabled computers?). In the next five years we will see all of these modalities grow in popularity, as home computers blend in with entertainment systems to support diverse sources of streaming audio and video (TiVo, Sling, Roku, Apple TV), and we take the most miniaturized forms of these electronic capabilities with us everywhere (even on highly active excursions like hiking, climbing, running, boating and swimming!) [3]. Mobile devices will be made tremendously more effective and convenient by Internet sites, services and applications geared toward mobile use, going beyond local content (traffic, shopping, weather, news, and events) and multimedia playback to include diverse communications: phone, messaging and even social networking [4,5].
When looking five years ahead, however, the most notable theme I anticipate is a simple twist on the long-standing premise that I began with: whenever and wherever you would like to produce information, that is an opportunity for computing technologies, and particularly the Internet, to help provide the means. We'll be doing more than taking some geo-referenced pictures and uploading them: we're talking about providing the context and commentary; micro-blogging on a massive and distributed scale; crowd-sourced coverage of live events; and even organizing flash mobs to create the events [6]. We will not just be consuming the information out there; we will be interacting with the world, and each other, and literally making the news.
I need information… and I need to produce information!
References:
[1] Akshay Java, class lecture on 2008-04-30 http://socialmedia.typepad.com/blog/files/socialmedia.pdf
[2] “The Long Tail” described on Wikipedia http://en.wikipedia.org/wiki/The_Long_Tail
[3] Travis Hudson “All Nike Shoes to Become Nike+ Compatible”, article in Gizmodo 2007-03-26 http://gizmodo.com/gadgets/portable-media/all-nike-shoes-to-become-nike%252B-compatible-247097.php
[4] Ellen Uzelac, “Mobile Travelers: Wireless devices, such as GPS units and cell phones, are transforming the way we vacation”, article in Baltimore Sun 2008-05-04
http://www.baltimoresun.com/travel/bal-tr.techtravel04may04,0,7453953.story
[5] "Mobile Social Network" described on Wikipedia http://en.wikipedia.org/wiki/Mobile_social_network
[6] Madison Park, “At harbor, 80s-tinged flash”, article in Baltimore Sun 2008-05-04 http://www.baltimoresun.com/news/local/bal-md.rickroll04may04,0,4649727.story
UMBC 2013
Of course it doesn't really matter how fast iLife is; I still get along fine online. That is where everything is these days. I opted for a 500GB drive when I got my machine, but it is barely full, four years later. Everything is online. I use Google Documents for the few text docs I need to exchange with some backwater friends and keep everything else in my personal cloud out in Rwanda. Lack of natural resources, a growing population, and increased prices from India and China drove outsourcing to new locations. Technically, my data isn't in Rwanda; it is spread all over the world. Natural disasters aren't a thing of the past, but the data is maintained by les Rwandais. It works as well as any cloud in Bangladesh or Peru that my friends use. Enough about my extracurricular activities and me.
After Blackboard lost their patent suit while I was in high school, a swarm of open source systems sprang up to devour the previously off-limits IP. UMBC switched over to one of them a year or so ago and it has been great. Built on the Mozilla Facebook platform, it has enabled collaborative wet dreams that professors trying to prep us for the real world could have only dreamed of when I was in junior high. Hell, Facebook barely existed back then. They had millions of users, sure, but the interface was so simple. I looked you guys up before the tour and, good job, you've learned how to use the fine-grained access controls; you'll appreciate that in four years. Social graphing and reputation markets were just topics of research back then, but you use them every day.
You may have heard of professors turning off the Internet back in 2008; that doesn't happen at UMBC, where it is used throughout the disciplines. Now that the semantic web is more reality than pipe dream, the web is more useful than ever. That required system they are quoting down at OIT and the bookstore is just a minimum; you might want to get something a little more powerful depending on your major. Right, you'll want something more powerful whatever you are doing. Granted, much of your system will exist online, but much of the rendering is done locally, so get an Intelvidia or AMD-ATI card in whatever you get. School is what you learn, but it is also who you network with, now more than ever in this connected world.
The world may be a little hotter, and gas a lot more expensive, but it sure seems like web technologies have helped make it a better place. Whatever you do here, you will be connected to everyone else on campus and around the world. No longer hindered by travel requirements, you may get a guest lecture from a Swedish designer on an all night design treatise or an explanation of biochemistry from Vietnam. They don’t call it an Honors University for nothing.
Thank you for visiting UMBC today. I know being outside in the big blue box on an August day in 2013 isn’t exactly your idea of fun, but it is good once in a while. For those using the iRobot telepresence units, please send them back to the Commons before you logout or you will be charged extra. School is a place you learn, not a place you are, but you might want to get on campus once in a while. You have such opportunity ahead of you and I think you have made a good decision choosing to go to UMBC. Class of 2018, my caps lock off for you! Sorry, that was a very very lame joke that I know only two of you understood.
The Web in 2013
The first and most obvious will be the adoption of XHTML 1.0 Transitional as a standard. Some web sites will be using pure XHTML, but the mainstream will be stuck on 1.0 Transitional due to the high percentage of users still running Windows XP and Internet Explorer 8.0 (the last supported version of IE on XP) with its horrific XHTML Strict rendering.
Another key difference will be the proliferation of high-resolution video advertisements. With 50Mbit fibre-to-the-home being standard, and 100Mbit available in some startup markets, high-definition video ads will have all but replaced the static and low-res animated graphics of 2008. Upon visiting sites, users will be bombarded with motion, forcing them to click on the ads or risk being sent into an epileptic seizure.
Instead of the traditional Flash, video ads will be streamed out in standards-based MPEG-5, which will have become the ISO standard for compressed 2160p high-definition content. Thanks to the standards-based codecs used, playback in browsers will be accomplished via built-in code; no special plug-in will be required to view the embedded videos.
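The HTML5 draft is already heading in this direction with a built-in video element, so a plug-in-free embed might look roughly like this (the MPEG-5 MIME type is, of course, hypothetical):

    <video controls autoplay>
      <!-- "video/mpeg5" is a made-up MIME type for the hypothetical codec -->
      <source src="ad-2160p.mp4" type="video/mpeg5">
      Your browser does not support embedded video.
    </video>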
By 2013, frames and tables as layout crutches will have been all but eliminated from modern web sites. Instead, well-placed div tags will denote content while CSS stylesheets tie everything together, effectively separating content from layout once and for all. All still using XHTML 1.0 Transitional, however...
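For example, the kind of two-column layout that used to require a table takes just a couple of divs and a few CSS rules (the ids and widths here are only illustrative):

    <div id="nav">site navigation</div>
    <div id="content">the actual content</div>

    #nav     { float: left; width: 20%; }
    #content { margin-left: 22%; }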
Instead of writing code by hand or using current craptastic programs that generate unreadable code, web developers will use a free open source toolkit for WYSIWYG development that generates completely readable XHTML and CSS code (including ECMAScript glue code that automagically works around known browser bugs/deficiencies).
These tools will use advanced NLP algorithms to automatically add semantic attribute data to pages. Users can manually tweak this attribute data (which will be represented as RDF embedded in XHTML), but for the most part the automatic processing will greatly increase the searchability and indexability of all web documents by providing standard semantic attributes which can be used by search engines, mashup engines, and query services.
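To give a flavor of what RDF embedded in XHTML looks like, here is a small RDFa-style fragment (the Dublin Core vocabulary is real; the URL and values are made up):

    <div xmlns:dc="http://purl.org/dc/elements/1.1/" about="/articles/web-in-2013">
      <span property="dc:title">The Web in 2013</span>
      by <span property="dc:creator">A. Blogger</span>,
      published <span property="dc:date">2008-05-05</span>
    </div>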
As for the question of whether or not the web will be "better" than it is now... Things will be much more standards-compliant, although not necessarily compliant with the standards of 2013, but by today's standards an incredible improvement. In fact, browsers of 2013 will block and refuse to render any pages which do not validate, rather than fall back on unspecified behavior. Because advertisements will be more graphical, they will tend to be more distracting than what we deal with today, but those same technologies will allow for advanced interfaces with fluid 3-D effects and transitions, making everything feel much more interactive.
But when it comes to actual content, people will be using this new enhanced web in much the same way as we use ours now; it will simply be flashier, interfaces will be more animated and fluid, and it will be much less bandwidth-efficient.
References:
http://en.wikipedia.org/wiki/XHTML
http://www.theregister.co.uk/2008/01/24/h20_sewer_rollout/
http://www.devarticles.com/c/a/Web-Style-Sheets/DIV-Based-Layout-with-CSS/
http://en.wikipedia.org/wiki/ECMAScript
Slow Growth in the Right Direction
HTML 5 is not set to be finished for 10 to 15 more years. It will provide many needed upgrades, including a much desired concept: HTML5 doesn't just define how valid documents are to be parsed, it also defines how parsing should work if documents are invalid, ill-formed, or broken, so that browser vendors can make their products fully interoperable with each other.
But that's 10 years away. In "web time," that is an eternity. So what do we do in the meantime? HTML 5 (or some other well-thought-out solution) will be a new standard for developers and designers to use. Proper use of this new standard will allow for more semantic, consistent web pages.
The reason this sounds like a dream is that most of the content on the web adheres to nothing. Perhaps we should fill the next 5 to 10 years with this: teaching standards and CSS. Before the semantic web, there has got to be a better markup language. Before the better markup language, people have to even get the point of standards. Part of those standards is separating content from presentation, thus the need for CSS.
None of the change we want is going to happen overnight. As quickly as trends come and go on the web, the languages it is written in change at a much slower rate. There is a hodgepodge of HTML versions in use. Despite the fact that CSS 2 was released in 1998, many web developers and designers do not know or do not use CSS despite its obvious benefits. CSS adoption has been hindered by the very same thing as HTML: cross-browser inconsistencies.
So when we push to teach standards to ourselves and to others, perhaps we should also include the browser programmers in this. Perhaps a half step in the progression is to put enough pressure on browser developers to provide similar and consistent output.
Where does that leave our grand plan?
- Consistent browser output
- Acceptance and practice of standards
- Thought-out, more semantic markup
- The web as it should be:
  - semantic data and content
  - separate content and presentation markup
  - robust page renderings and visualizations
To move forward, we need to hit each of these points square on the head. We need to walk before we run. If you want to use the popular Wild West example, we need to bring law and order before we can advance. So what would this all look like?
I would love to see something like this (the idea is taken from A List Apart):
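A rough sketch of my own along those lines (not the actual A List Apart markup; the ids and stylesheet name are made up):

    <link rel="stylesheet" type="text/css" href="layout.css" />
    ...
    <div id="header">site identity</div>
    <div id="nav">site navigation</div>
    <div id="content">
      <p class="entry">the actual data and content of the page</p>
    </div>
    <div id="footer">contact and copyright info</div>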
With this, there should be a separate file for styling the output. Inside the layout blocks should be semantically defined content and data (I leave the how to the semantic researchers). A more semantic DOM would probably help.
With these features fully enabled and in practice, I can see search, APIs, and data mining becoming far more practical and powerful. Advertising will continue to blossom; if it is easier for a machine to understand a page, then the software can present better targeted ads.
Speaking of ads and money, the last thing that would help all this is money. Search, advertisements, and business need to be able to see the clear benefits from these efforts. Companies used to ponder why they would need a website; now most companies launch a website at their conception and wouldn't do business without one. Perhaps, one day, we could convince companies that they shouldn't launch a new venture without first creating a standards-based, accessible, semantic web site.
I look forward to the web of the future.
(Assignment #5) The shift from entity to tool
First, let us recall how not the web, but computers, looked not five, but fifteen years ago. That old, old time when people were using DOS and GUIs were a new thing. Think of the way people interacted with a computer back then, through a command line interface: the user issues a command and the computer executes it. Interaction with a computer was like a conversation. But then came a fancy Windows GUI and everything was done by clicking on buttons. Now the computer isn't the other side of a conversation, but a place to put the tools you use in. Those tools being applications used to manipulate documents, files, and their subparts. So the CLI-to-GUI shift is a shift from interaction with "the computer" to interaction with what it manipulates: data. I am not saying that one kind of interaction is better than the other, but merely that what the user interacts with is different. But since the average user would rather directly manipulate objects than have a civilized conversation with an automaton, the GUI won in popularity.
Now look back at the shift from Web 1.0 to Web 2.0: In Web 1.0 we had static web pages which were reached by typing a URL in the browser. So we have the user request a page, and the browser retrieves it, and that's it (similar to the CLI interaction, isn't it?) In today's Web 2.0 we have shifted our focus, again, to the content on the web, as opposed to the medium through which it is delivered. What I mean by this is that YouTube is used to retrieve a video, Facebook is used to interact with a person, Wikipedia is used to get information, etc. This shift to viewing websites as tools becomes more apparent when APIs rather than the websites themselves are used, or when mashups are created.
But this shift is not yet complete, which becomes apparent during interaction with these websites. When someone goes to Google or Ask or YouTube or Wikipedia, the first thing they do is enter search terms, and then they get their data. Even when we talk about searching the internet we sometimes say "Let me ask Wikipedia." "Wikipedia," clearly, is an entity when we look at it like that, and the next step would be to make the search engine's presence less apparent. Today's browsers are already on their way to making searches more transparent: if you enter search terms into Firefox's URL bar, for example, it will sometimes guess and immediately call Google's "I'm Feeling Lucky", and sometimes it will send you to the search results. Plugins and programs like Mash Maker go another step and truly act as tools that manipulate the data on the web.
Another way in which the web is becoming a tool rather than an entity is through desktop applications that use the web without explicitly invoking the browser. Yes, these have been present throughout the history of the web, but with higher bandwidths and wireless internet in more and more places they are becoming much more usable and much more popular.
So in 2013 our interaction with the web will be a lot more transparent, and more intimately integrated with the desktop experience.
The Future of The Web
Future Web: 2013
In 2013 the web will be different but still very similar to the web of 2008. Since the web is built around the people using it, and people have a tendency not to change all that fast, the web will change the same way: not very rapidly. We will still likely have almost all of the technologies we use today, with only a few additions, if any. Technologies always take time to catch on, even major breakthroughs. It took time for things like MySpace and Facebook to catch on, and it will take time for other technologies to catch on.
One technology that will exist in a more advanced form is the semantic web. I believe that Semantic Web technologies will play a role in the future web but not in the web that will exist 5 years from now. The semantic web is a complicated idea that will be revolutionary when it catches on, but cannot possibly catch on until it has been perfected.
Another key enhancement that will come about within 5 years is an increase in bandwidth. Video and Flash applications are already becoming quite common on the web, and their content and quality are limited only by the speed of the average user's internet connection. Internet2 helped to create the high-speed Abilene Network and the National LambdaRail project, which are a start toward much higher speeds than were previously possible. It is only a matter of time before technologies like those make it to the internet.
It is my opinion that every time an advancement is made in web technology and usage, a bad aspect of the new tech is introduced but is immediately countered by a good aspect that the same technology introduces. Facebook and other social networking applications allow unscrupulous users to 'spy' on people, but also allow people with similar ideas to congregate in groups and share ideas and experiences that they otherwise would not have been able to share. The world will be no better or worse, but things will be easier for everyone: people with good and people with bad intentions.
Internet2 - http://en.wikipedia.org/wiki/Internet2
Slides - http://www.slideshare.net/hchen1/semantic-web-20-381520
The Future of the Web
Fast forward 10 years. The web today is cooked up in a variety of languages and frameworks and delivered to us using complex platforms that are built on a foundation of accessibility and scalability. Pages are beautifully styled, content is polished, audiences are targeted, and information is abundant. The evolution from the web of 1996 to the web of today was unlike anything we've seen before.
That brings us to the future. At this pace, what will the internet be like in 5 years, in 2013? Some things will evolve faster than others; old ways will die out or become popular again; some new exciting technology may be introduced that changes all the dynamics of the internet, and if I had an idea of what it was, I would be rich. =)
So what will change?
Everything will be HD by default
I am a self-admitted news junkie, so I spend a lot of time going through the hordes of websites providing news from various sources and reading the user comments that help shape a story's impact. The single most frustrating thing about most of the news I read, though, is that the photos accompanying the story are low resolution, often 400x200 or some other 1996-era size. I predict news sites, especially mainstream sites like Reuters or CNN, will have galleries of HD pictures with each story. Each of these sites currently has a 'pictures of the week' or 'pictures of the month' photo gallery, but the images are low-res. These sites feature stunning pictures that really have an impact on the story they are telling, but without access to the full quality, the whole story can never be told. So by 2013, most of the pictures we see on the net will be in high definition. The following prediction will make this possible.
Bandwidth – Not a problem
According to a report by the Wall Street Journal, the US is currently ranked 15th in the global broadband market, and that rank is falling. Bad business practices and a lack of competition have stifled the growth of US broadband capabilities, and we are not seeing the types of speeds that countries like South Korea or most European countries see. Hopefully the next president will be more open to the idea of net neutrality and take seriously our need for better access to the internet. Better broadband and more bandwidth will allow the publishing of high-definition photos and video because more people will have access to them. Right now, bandwidth is the only roadblock standing in the way of the high-definition web. (Companies save money by not publishing HD content; I personally find it disgusting that telecoms seem to be turning bandwidth into a commodity.)
Web 3.0
The 'semantic web' is still a relatively new subject, with researchers scrambling to find a way to implement the technology that will once again change the way the web works. While I believe we will see successes in our progression to the semantic web, I think there will be intermediate steps along the way that promote ideas of the semantic web but lack the fully autonomous 'agents' that are at its core. We have to see a migration that is on the scale of Web 2.0; that is, we have to see incentives for business to invest in change. The incentive that is attracting companies now is advertising revenue. Mainstream companies do well to develop rich content that invites user contribution, attracting an audience that can be advertised to. These mainstream companies play host to 'average Joe' users and those who are not web enthusiasts tracking the changes of the web as part of their thinking and understanding.
That's why I think we will see a 'Web 3.0' before we see the full-blown semantic web as envisioned by the Tim Berners-Lees of the world. Web 3.0 will first attract users via quality, usable information found by intelligent searches and delivered automatically with the help of RDF-like languages. After successful broad-scale, beta-like trials that see users utilizing Web 3.0, the big push will happen and mainstream companies will once again latch on and attract what I call the average Joes. The transition will be measurable, natural, and one that will serve as the next platform for the evolving web.
Wrap up
After seeing how fast Web 2.0 and current standards came about, 5 years in the internet world will seem like a lifetime for developing technology. Hybrid applications will become the norm, IPTV will overtake regular TV, and personal web spaces will bind to users in a similar way as cell phones. Content will be high-def by default, available for sharing and circulation, and the majority of it will come from users like you and me. It is my hope that the internet stays free from political manhandling and corporate strongholds. As net neutrality gains traction, it will be the responsibility of the 'you and me's to ensure the internet's freedom. The internet is the most democratic medium ever to exist, and the same power that is driving the web today (us) will keep it moving tomorrow.
References
“XHTML™ 1.0 The Extensible HyperText Markup Language (Second Edition)”, W3C,
26 January 2000. 1 May 2008. http://www.w3.org/TR/xhtml1/
Schatz, Amy. “U.S. Broadband Rank: 15th and Dropping”. 2007. Wall Street Journal, 1 May 2008.
Yihong-Ding. A simple picture of Web evolution. ZDNet, 5 November 2007. 1 May 2008. < http://blogs.zdnet.com/web2explorer/?p=408>
The state of the web in 2013
Perhaps the most significant difference is that there will be more "small screen" browsers accessing the web than "full screen" browsers. Mobile phones with real web browsers will dominate the User-Agent HTTP request headers in server logs. The iPhone is just the beginning of enabling full use of the web on small mobile phone screens. In five years, many more people will be surfing the web from their mobile phones than from their laptops or desktops.
HTML5 will be a standards document, but browser support for it will not be complete, and even fewer sites will make use of it [1]. The good news is that HTML5 adds some minor semantic markup to the HTML specification, but it will not bring about the semantic web revolution [2]. I believe blogging software and other content management systems will have built-in support for semantically tagging content, but the majority of websites will either not use it or will still be developed with tools that do not handle semantic markup.
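For the curious, the kind of minor semantic markup in the current HTML5 draft looks roughly like this (the content is made up):

    <article>
      <header>
        <h1>The state of the web in 2013</h1>
        <time datetime="2008-05-05">May 5, 2008</time>
      </header>
      <p>Post body goes here...</p>
      <footer>Tags, comment links, and so on.</footer>
    </article>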
"Web 2.0" will have gone mainstream. Users will have their choice among a large selection of mostly interoperable online office suites including photo and video manipulation applications. And just about everything else will be accessed through a web browser, native platform applications will be thought of as "old school".
All in all, nothing revolutionary will happen in the next five years, only evolutionary extension of the current technologies.
[1] [http://blog.whatwg.org/html5-geekmeet]
[2] [http://www.alistapart.com/articles/previewofhtml5/]
Sunday, May 04, 2008
The Internet, 2013
Semantic web technologies play a larger role in the web of 2013. Users have tired of entering the same profile information on every new website they decide to join. This has allowed FOAF and other similar formats to amass great popularity throughout the internet. Once a user has created a profile that contains semantic data, it's simply a matter of other websites accessing that data.
In order to usher in this new internet age, users could be made more knowledgeable so that they would be able to format their semantic data. Web sites could also provide scripts that would convert their existing profiles into forms that would work with the semantic web of the future. A few modern-day blogs already produce FOAF output that users can utilize.
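As a taste of what that output looks like, here is a minimal FOAF fragment of the sort a blog might export (the names and URL are made up):

    <rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
             xmlns:foaf="http://xmlns.com/foaf/0.1/">
      <foaf:Person>
        <foaf:name>Jane Example</foaf:name>
        <foaf:homepage rdf:resource="http://example.org/jane"/>
        <foaf:knows>
          <foaf:Person><foaf:name>John Example</foaf:name></foaf:Person>
        </foaf:knows>
      </foaf:Person>
    </rdf:RDF>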
The internet will be a better place for the typical user. For instance, people will no longer have to worry about the numerous scams they can be subjected to via the web. Spammers will be held more accountable because they can easily be tracked down and dealt with accordingly. Parents won't have to worry about what kind of content their children could be getting into because they can control what kinds of websites they subscribe to via their monthly bill. However, the users that knew and loved the old internet will have to be coaxed into embracing these new restrictions.
2013 - Return of the Wild West
In the next five years the speed and ease of web publishing will economically shift information providers to primarily online distribution, rendering newspapers obsolete. This shift will threaten traditional notions of copyright and intellectual property. To cope with these changes the television, radio, and music industries will undergo radical changes.
Virtually all information of any kind will be online, and people will simply never look for information if it is not online. Libraries will move their collections into full-text online documents and then shut their doors. Handheld, book-like electronic PDF readers will become the norm. For people who still prefer paper, there will be a cheap printing and binding service (online order, of course) that prints PDFs as hardcover or paperback books and then mails them to customers.
Employers will all "google", "facebook", etc. their employees first, and these results will highly influence their choices. Your online reputation will become more important than your credit score, and may cause you to be turned down for a job or refused for apartment rentals.
This will cause search engine optimization to intensify. Organizations will spring up both to protect your reputation and to damage the reputations of others. General lawlessness will prevail upon the web. Search engine optimization packages and intelligent personal-data-crawling agents will become like guns in the Wild West. There will be shootouts through the streets of the net, with nations, companies, and individuals vying for control of valuable online real estate in the form of domain names, search results for query words, and "true" semantic information.
The victor will be determined by who possesses the most technical skills, or who hires the most talented coders. These shootouts will pollute the web with false information that will leave the technical have-nots and the poor swimming in a sea of falsehood. Trust propagation algorithms will only be as effective (and trustworthy) as the coders who wrote them. The gap between the rich and the poor will increase. Everyone will have a voice, but some will shout louder than others and no one will know who or what to trust.
The problem might even get bad enough that someone will solve it.
Web 2013
The web started as a means to facilitate communication and information exchange. In 1994, when it took off, it was just HTML markup used to display text. The web was considered an 'information highway', and we saw the start of the dotcom boom with the rise of e-commerce websites like Amazon and eBay and email providers like Hotmail. After the dotcom bust, only a few websites like Google, eBay, and Amazon survived, and, with new technologies like AJAX, XML, RSS, and folksonomies, we now see a different genre of rich websites: Gmail, wikis, Flickr, YouTube, social networking sites like MySpace and Facebook, Blogger, Google Maps, and mash-ups. AJAX and similar technologies made applications provide a faster, more seamless web user experience. Applications tend to be characterized by users generating content on the web. In November 2006, there were over 8 billion web pages online, based on a variety of interesting concepts targeting users of all ages.
[Reference: 10 Years That Changed the World]

Web in 2013:
5 more years and you'll see -
Even if it's just a period of 5 years to 2013, I expect the emergence of new web technologies that'll take the web user experience to the next stage, and these frameworks will be easy for web developers to incorporate. We'll see more full-featured browsers implementing the HTML5 standard. Mash-up technologies will continue to facilitate the combination of data from different web sources and will provide many new interesting and useful features. We'll see many applications being provided by relatively naïve developers. There'll be continued convergence of telecoms, social networking sites, the semantic web, and composite applications. We have seen recent developments in mobile application frameworks, like the iPhone SDK and Google Android SDK becoming available to mobile application developers. There'll be more web applications where the channel of the information source is the cellphone.
With the WiFi web and improved Internet connectivity, we'll see 'always on' social communities, and there will be more applications similar to Twitter. There will be a proliferation of different virtual social groups, and we'll see a highly virtually socialized generation. Social networking sites will be more expressive. Current issues with social networking sites, like privacy and security, will remain, but I don't expect the rise of new adverse social and psychological problems like personality disorders or digital xenophobia in just 5 years. Life online will have changed the way we think.
Search engines will be more efficient and personalized, and users will be able to get exact information. The web will continue to be an entertainment platform. Technologies that govern 3D online digital content will continue to push the limits of multimedia for web applications. Online game applications like World of Warcraft and Second Life will continue to be popular. These applications will redefine entertainment web applications. E-commerce websites will have more sophisticated recommendation systems making use of semantics and online social networks. Web applications will continue to satisfy human needs and will enhance them.
We have already seen social web and semantic web analysis tools playing an important role in politics and political campaigns. Online voting will have been incorporated. By 2013, we'll see new cyberlaws that try to resolve issues related to privacy and intellectual property/copyrights. Creative Commons will play a vital role in the new copyright laws. Internet penetration in the world will increase, giving us the opportunity to consider potential web users from a variety of socio-economic statuses, cultures, ethnicities, and geographical locations.
In summary, the web will continue to facilitate our need to connect, exchange, share, compete, and network in better ways.
FuTuRe WeB
So clearly Web 2.0 is a marketing term. And that's what I had thought. So it's just the use of existing technologies in a different way that is convenient for users and attractive. Now I am in a better position to predict the FuTuRe WeB (this is how logos are constructed in the Web 2.0 world, according to the following video).
Jokes apart. Every coin has two sides. And I believe that future web will be a mix of good things and bad things.
Good things first.
The next generation web will be a symbiosis of intelligent nodes. Yes, the conventional client-server model may not live. Rather, a P2P kind of model will be in place. Let me explain why. In my opinion, here's how we reached the web that we currently have. As computers became cheaper, more and more people started buying them. Soon, the static nature of computer data made them too boring for use as an entertainment box. Also, the information that could be found was limited. So people started using the web. They liked it due to its dynamic nature, and then with increasing usage there was a need for faster connections. As broadband was introduced, we started hosting videos and other rich content. Thus the way we use the web has changed with changes in technology. And changes in technology were driven by the demand for a better experience. Soon, we will reach a point where we won't have enough bandwidth to host rich, high-quality content. Then we will need to break the conventional client-server model. User communities will share content on a P2P-based web, decreasing the load on the servers. Maybe BitTorrent will be run from within a browser. Browsers will act as clients as well as servers. This kind of model will be possible because every internet user will have his own domain. The names will be weird, though.
As Sarah has already mentioned, we will have devices that run only browsers. Browsers will be the new OSes. An Internet OS will play an important role. If the connection is fast enough, it won't be hard to download 1GB of web operating system code in a few seconds. With such high bandwidth, an internet OS will act as fast as a desktop OS does. The network will be used as a platform. P2P architecture will also make distributed computing over the web far more effective.
With a web OS, we will be able to deal with an open platform. Apps will be portable, easily upgradable, and free, and we will have a file-format-independent web. OpenID variants will be the norm.
The semantic web will definitely find wide use. If we want companies to share their data, we may need to go with a licensing model or enable those companies to show ads along with the data. Semantic web concepts will be used between client and server as well as between two servers.
HTTP could get replaced by something else. Something that supports P2P structure well.
Like "google" has now become a verb, other terms from computer terminology will start finding their space in the dictionary. E.g., if you don't like someone you will ctrl-c him/her. After a conference there will be F5-ments served. If you don't believe me, then you should watch this video.
Also, very thin computers will be available to drive the next web.
Now the bad part
Cybercrime will be a major concern. It will be very hard to track down the criminals due to the complex nature of the new web. There will be a lot of privacy and security issues. Unless we educate the users, the web won't be a safe place.
Life will be too dependent on the web. Would you be able to live without touching your computer for 8 consecutive days? People will get bored with social networking sites. They will realize that interacting with real people is more important than chatting with a stranger online. Overuse of the web may have a negative impact on people's physical and mental health. It is better to play a real sport rather than a computer game. Believe me, it's good for your health.
Serious attempts will be required on organizations' part to stop employees from wasting their time on social networking sites. The web should be used to make things convenient and easy and to make information accessible.
But the good thing is that we will have tools to measure users' productivity.
Web in 2013
One application area that is going to grow is social networking sites like MySpace, Facebook, and Orkut. These sites have tremendous potential to grow in Asian countries like India and China. Every morning I check my email accounts - Google, Yahoo, UMBC [in order of priority :)] - and then my Orkut account. I won't be surprised if by 2013 I am checking my Orkut account first.
I don't think the semantic web in its current form, which is too demanding, is going to be accepted. Thinking from a user perspective, I don't see any strong use case for it to exist. The wine selection example which we saw in class didn't impress me much. I would rather select wine from the restaurant's menu card than query the web. Yes, but we do need means to share, integrate, and reuse data across applications, and the semantic web or similar technologies can be helpful only if they blend with the existing web. The semantic web in its improved form will take some time to annotate the world's information.
Looking at the rate at which blogging is catching on with the younger generation, I think every person will have an identity/personality on the web. With more and more user content being generated, privacy and security are going to be a great concern.
By 2013, the web will be more personalized. Capturing more personal information about your users and knowing their preferences is a key area to concentrate on. Keeping users engaged in order to learn more about what each user wants, giving him everything at one website so that he does not have to leave the site, and customizing the web for users seems to be a good strategy.
For example, Project Joey allows you to customize your mobile web experience by bringing the web content you need most to your mobile phone.
I also feel the way in which web pages are rendered will undergo a dramatic change. More 3D animation with smooth surfaces will come to exist. In 2013, browsing the web will be a completely different experience, with much less use of the keyboard, as we saw in the video. Voice recognition is certainly going to catch on. For example, you could search Google by just speaking rather than typing. Here's a similar example of voice web search:
With so much web around, there will be groups of people who get frustrated and form communities like "We hate computers", but they will have to form such communities online! In short, you cannot escape the web.
The web in 2013
The web in 2003:
-MySpace*, Meetup.com, LinkedIn*
-del.icio.us*
-Wikipedia
-LiveJournal, Blogger,
-Mapquest
-Amazon
-Ebay
-PayPal
A lot of the web sites that were known to only a few early adopters in 2003 are quite mainstream now. So the websites and ideas that will be wildly popular in 2013 probably exist now in some form.
Picking out which websites today will really take off in the next five years is difficult. I think that mashups will continue to grow in popularity. Commercial mashups will be created that pay licensing fees for their data.
I see semantic web technologies primarily being used behind the scenes to improve the web of today. The average web users of 2013 probably won't know what the semantic web is, but it will enable them to find the information they need every day. Sites like FreeBase will continue to grow and applications will be built to take advantage of their free structured data. Improvements in natural language processing will help the semantic web to use unstructured data.
Applications will continue to move off of the desktop and onto the web. Already, slimmed-down free versions of desktop applications are appearing online; Google Docs and Adobe Photoshop Express are good examples. Cheap online storage providers like Jungle Disk or XDrive will start to replace large hard drives in home PCs. In the next few years I would not be surprised to see laptops being sold with only a web browser for an operating system.
Accessing the web through mobile devices will become more and more common, so more sites will appear that are optimized for viewing on those devices.
Watching TV shows online will become quite common. Because of this, advertisers will be forced to adapt and find new ways to reach consumers. Companies will try to create or buy internet memes. For example, a fast food restaurant will buy icanhascheezburger.com and turn it into a marketing slogan.
I think the web of 2013 will be an improvement on the web of today. I believe that it will become easier to find information and manage personal data.
Friday, May 02, 2008
Facebook applications = Spyware?
Does your social networking site need antispyware software? The BBC recently ran a test by writing a Facebook application that harvested the user's, and all of the user's friends', Facebook data after it was installed and run by the user. As stated, it didn't just steal the data of the person who installed the app, but data from their friends too. Not really a virus, not technically spyware, and Facebook says that they will remove anything that behaves badly. Of course, they would have to realize it is behaving badly first.
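For a sense of how easy that harvesting is, this is roughly the kind of FQL query the Facebook API lets an installed application run (the table and field names are from memory and approximate, so treat it as a sketch):

    SELECT name, birthday, hometown_location
    FROM user
    WHERE uid IN (SELECT uid2 FROM friend WHERE uid1 = {uid of the user who installed the app})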
Thursday, May 01, 2008
Adobe opens swf/flv files to developers
The Open Screen Project is working to enable a consistent runtime environment – taking advantage of Adobe® Flash® Player and, in the future, Adobe AIR™ -- that will remove barriers for developers and designers as they publish content and applications across desktops and consumer devices, including phones, mobile internet devices (MIDs), and set top boxes. The Open Screen Project will address potential technology fragmentation by allowing the runtime technology to be updated seamlessly over the air on mobile devices. The consistent runtime environment will provide optimal performance across a variety of operating systems and devices, and ultimately provide the best experience to consumers.
Check out their project site here: http://www.adobe.com/openscreenproject/