Showing posts with label Information Technology.

Monday, April 30, 2007

Low-Energy LED Lighting for Streets and Buildings

The 1,500 foot long LED display on the Fremont Street Experience is currently the largest in the world.

A light-emitting diode (LED) is a semiconductor device that emits incoherent narrow-spectrum light through a form of electroluminescence. LEDs are small extended sources with extra optics added to the chip, which emit a complex intensity spatial distribution. The color of the emitted light depends on the composition and condition of the semiconducting material used and can be infrared, visible or near-ultraviolet.

The Department of Trade and Industry-led Technology Programme in the UK has awarded a £175,000 ($350,000) grant to researchers at The University of Manchester to develop powerful low-cost LED lighting modules that can be used in buildings and on roads. Dialight Lumidrives - a company founded by a former student - is contributing another £175,000 to the scheme.

The main goal is to investigate how tightly packed groups of LEDs can be made to work safely and reliably, with less energy consumption and lower costs. LED lighting solutions have the potential to reduce energy consumption by between 25 and 50 per cent, depending on the application.

Illumination applications using LEDs are already being used in advertising panels in the streets and for traffic lights, but their use in street and building lighting has yet to overcome some obstacles.

The technical obstacles involve thermal and electrical issues at the desired lighting levels (12,000 lumens and above, whereas a typical 60 W household light bulb produces about 800 lumens), such as the amount of heat generated by LEDs packed closely together.

Since the project aims to develop LED modules to be used outside, environmental factors will also be a concern, such as glare, pollution and even the possibility of a bird nesting over a vital heatsink.

Dr Roger Shuttleworth from the Power Conversion Group at The University of Manchester, said: "LED technology first came to prominence in instrument displays back in the 1970s, but we are increasingly seeing it used in things like traffic signals and car lights. Towards the end of the twentieth century, the old fashioned sodium street lights that made everything look orange were gradually replaced by high-pressure sodium lamps. While these are brighter and more aesthetically pleasing, and can help tackle street crime and anti-social behaviour, they are also less energy efficient. With the environment at the top of the public and political agenda, energy saving has become a very important issue. When you consider how many street lights there are in the UK alone, it's clear there are some big opportunities for energy and cost savings."

The many benefits of LEDs will include cutting energy consumption and overall running costs, reducing light pollution and the glow that radiates from big cities, and a longer lifespan, which means they would need to be replaced less often, potentially cutting down on traffic disruption and local council repair bills.

Friday, March 16, 2007

Start-Up Fervor Shifts to Energy in Silicon Valley



 


Photo: J. Emilio Flores for The New York Times. Andrew Beebe is the president of Energy Innovations, which makes low-cost solar panels.
Published: March 14, 2007



SAN FRANCISCO, March 13 — Silicon Valley’s dot-com era may be giving way to the watt-com era.



Out of the ashes of the Internet bust, many technology veterans have regrouped and found a new mission in alternative energy: developing wind power, solar panels, ethanol plants and hydrogen-powered cars.

It is no secret that venture capitalists have begun pouring billions into energy-related start-ups with names like SunPower, Nanosolar and Lilliputian Systems.

But that interest is now spilling over to many others in Silicon Valley — lawyers, accountants, recruiters and publicists, all developing energy-oriented practices to cater to the cause.

The best and the brightest from leading business schools are pelting energy start-ups with résumés. And, of course, there are entrepreneurs from all backgrounds — but especially former dot-commers — who express a sense of wonder and purpose at the thought of transforming the $1 trillion domestic energy market while saving the planet.

“It’s like 1996,” said Andrew Beebe, one of the remade Internet entrepreneurs. In the boom, he ran Bigstep.com, which helped small businesses sell online. Today, he is president of Energy Innovations, which makes low-cost solar panels. “The Valley has found a new hot spot.”

Mr. Beebe said the Valley’s potential to generate change was vast. But he cautioned that a frenzy was mounting, the kind that could lead to overinvestment and poorly thought-out plans.

“We’ve started to see some of the bad side of the bubble activity starting to brew,” Mr. Beebe said.

The energy boomlet is part of a broader rebound that is benefiting all kinds of start-ups, including plenty that are focused on the Web. But for many in Silicon Valley, high tech has given way to “clean tech,” the shorthand term for innovations that are energy-efficient and environmentally friendly. Less fashionable is “green,” a word that suggests a greater interest in the environment than in profit.

The similarities to past booms are obvious, but the Valley has always run in cycles. It is a kind of renewable gold rush, a wealth- and technology-creating principle that is always looking for something around which to organize.

In this case, the energy sector is not so distant from other Silicon Valley specialties as it might appear, say those involved in the new wave of start-ups. The same silicon used to make computer chips converts sunlight into electricity on solar panels, while the bioscience used to make new drugs can be employed to develop better ethanol processing.

More broadly, the participants here say their whole approach to building new companies and industries is easily transferable to the energy world. But some wonder whether this is just an echo of the excessive optimism of the Internet boom. And even those most involved in the trend say the size of the market opportunity in energy is matched by immense hurdles.

Starting a clean technology firm is “not like starting an online do-it-yourself legal company,” said Dan Whaley, chief executive of Climos, a San Francisco company that is developing organic processes to remove carbon from the atmosphere. “Scientific credibility is the primary currency that drives the thing I’m working on.”

Just what that thing is, he would not specify. For competitive reasons, Mr. Whaley declined to get into details about his company’s technology. His advisory board includes prominent scientists, among them his mother, Margaret Leinen, the head of geosciences for the National Science Foundation.

In the last Silicon Valley cycle, Mr. Whaley’s help came from his father. In 1994, he did some of the early work from his father’s living room on GetThere.com, a travel site. It went public in 1999 and was bought by Sabre for $750 million in 2000.

This time around, entrepreneurs say they are not expecting such quick returns. In the Internet boom, the mantra was to change the world and get rich quick. This time, given the size and scope of the energy market, the idea is to change the world and get even richer — but somewhat more slowly.

Those drawn to the alternative-energy industry say that they need time to understand the energy technology, and to turn ideas into solid companies. After all, in contrast to the Internet boom, this time the companies will need actual manufactured products and customers.

“There are real business models and real products to be sold — established markets and growing economics,” said George Basile, who has a doctorate in biophysics from the University of California, Berkeley and specializes in energy issues.

MIT provides blueprint for future use of coal

Leading academics from an interdisciplinary MIT panel issued a report today that examines how the world can continue to use coal, an abundant and inexpensive fuel, in a way that mitigates, instead of worsens, the global warming crisis. The study, "The Future of Coal--Options for a Carbon Constrained World," advocates that the United States assume global leadership on this issue through adoption of significant policy actions.





Led by co-chairs John Deutch, Institute Professor, Department of Chemistry, and Ernest J. Moniz, Cecil and Ida Green Professor of Physics and Engineering Systems, the report states that carbon capture and sequestration (CCS) is the critical enabling technology to help reduce carbon dioxide emissions significantly while also allowing coal to meet the world's pressing energy needs.




According to Deutch, "As the world's leading energy user and greenhouse gas emitter, the U.S. must take the lead in showing the world CCS can work. Demonstration of technical, economic and institutional features of CCS at commercial scale coal combustion and conversion plants will give policymakers and the public confidence that a practical carbon mitigation control option exists, will reduce cost of CCS should carbon emission controls be adopted and will maintain the low-cost coal option in an environmentally acceptable manner."

Moniz added, "There are many opportunities for enhancing the performance of coal plants in a carbon-constrained world--higher efficiency generation, perhaps through new materials; novel approaches to gasification, CO2 capture and oxygen separation; and advanced system concepts, perhaps guided by a new generation of simulation tools. An aggressive R&D effort in the near term will yield significant dividends down the road and should be undertaken immediately to help meet this urgent scientific challenge."

Key findings in this study include:

-- Coal is a low-cost, per BTU, mainstay of both the developed and developing world, and its use is projected to increase. Because of coal's high carbon content, increasing use will exacerbate the problem of climate change unless coal plants are deployed with very high efficiency and large-scale CCS is implemented.

-- CCS is the critical enabling technology because it allows significant reduction in carbon dioxide emissions while allowing coal to meet future energy needs.

-- A significant charge on carbon emissions is needed in the relatively near term to increase the economic attractiveness of new technologies that avoid carbon emissions and specifically lead to large-scale CCS in the coming decades. We need large-scale demonstration projects of the technical, economic and environmental performance of an integrated CCS system. We should proceed with carbon sequestration projects as soon as possible. Several integrated large-scale demonstrations with appropriate measurement, monitoring and verification are needed in the United States over the next decade with government support. This is important for establishing public confidence for the very large-scale sequestration program anticipated in the future. The regulatory regime for large-scale commercial sequestration should be developed with a greater sense of urgency, with the Executive Office of the President leading an interagency process.







-- The U.S. government should provide assistance only to coal projects with carbon dioxide capture in order to demonstrate technical, economic and environmental performance.

-- Today, Integrated Gasification Combined Cycle appears to be the economic choice for new coal plants with CCS. However, this could change with further research, development and demonstration, so it is not appropriate to pick a single technology winner at this time, especially in light of the variability in coal type, access to sequestration sites and other factors. The government should provide assistance to several "first of their kind" coal utilization demonstration plants, but only with carbon capture.

-- Congress should remove any expectation that construction of new coal plants without carbon dioxide capture will be "grandfathered" and granted emission allowances in the event of future regulation. This is a perverse incentive to build coal plants without carbon dioxide capture today.

-- Emissions will be stabilized only through global adherence to carbon dioxide emission constraints. China and India are unlikely to adopt carbon constraints unless the United States does so and leads the way in the development of CCS technology.

-- Key changes must be made to the current Department of Energy research, development and demonstration program to successfully promote CCS technologies. The program must provide for demonstration of CCS at scale; a wider range of technologies should be explored; and modeling and simulation of the comparative performance of integrated technology systems should be greatly enhanced.

The report is available online at http://web.mit.edu/coal.

Source: MIT

Tuesday, February 27, 2007

Speed Up Network Browsing

Network sharing used to be far superior to the Internet file sharing available over a modest modem, so we all relied on our local sharing protocols. Time passed, and broadband connections became so widespread and popular that the old modem met its demise.

Nowadays, we share over the Internet; at such high speeds, the local network has effectively spread beyond its local boundaries. However, the LAN is not dead. We still use the Local Area Network at the office or in the neighborhood. The only problem is that users are often unhappy with how slowly the network can be browsed.

It seems that communication between networked computers under Windows has some shortcomings that slow down browsing. Excluding hardware problems, which are not the subject of this article, some tweaks can be applied to smooth things out.

All of these tweaks involve editing the registry, which means you need to be careful. To keep yourself out of trouble, make sure you back up the registry before you edit it.

Disable Network Task Scheduler

This tweak stops Windows from searching networked computers for scheduled tasks. That search is the reason opening a network folder can take a while, which is not pleasant at all.

Go to Start > Run and type Regedit. When the registry editor opens, locate this path:

HKEY_LOCAL_MACHINE > SOFTWARE > Microsoft > Windows > CurrentVersion > Explorer > RemoteComputer > NameSpace

Once you find it, just delete the following subkey:

{D6277990-4C6A-11CF-8D87-00AA0060F5BF}

You may not find the key mentioned above. That's OK; just proceed to the next tweak.
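If you prefer to script the change rather than click through Regedit, here is a minimal sketch using Python's standard winreg module. It assumes Python 3 on Windows and an elevated (administrator) prompt, and it simply reports when the key is already absent.

    import winreg

    # The "scheduled tasks" entry under RemoteComputer\NameSpace.
    PATH = (r"SOFTWARE\Microsoft\Windows\CurrentVersion\Explorer"
            r"\RemoteComputer\NameSpace")
    TASKS_KEY = "{D6277990-4C6A-11CF-8D87-00AA0060F5BF}"

    try:
        # Removing the subkey stops Explorer from querying remote machines
        # for scheduled tasks whenever a network folder is opened.
        winreg.DeleteKey(winreg.HKEY_LOCAL_MACHINE, PATH + "\\" + TASKS_KEY)
        print("Key deleted.")
    except FileNotFoundError:
        print("Key not present; nothing to do.")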

Raise the size of the request buffer

When dealing with a high-latency connection, you need to increase the SizReqBuf value. This buffer is set by default to 4356 (decimal), a value Microsoft says provides an acceptable level of performance under normal conditions. Since we are not satisfied with how network browsing behaves, we can treat our conditions as "not normal" and change the value. In most LAN conditions, the best value for SizReqBuf seems to be 16384. Use this value on computers with more than 256 MB of RAM.

To change the value, first open the Registry Editor (as presented at the previous tweak) and locate

HKEY_LOCAL_MACHINE > System > CurrentControlSet > Services > LanmanWorkstation > Parameters, then create a DWORD value named SizReqBuf and set it to a decimal value of 16384.
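The same change can be scripted. The sketch below is only an illustration (again assuming Python 3 on Windows and an elevated prompt); it creates the value if it is missing or overwrites it if it already exists.

    import winreg

    LANMAN = r"SYSTEM\CurrentControlSet\Services\LanmanWorkstation\Parameters"

    with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, LANMAN, 0,
                            winreg.KEY_SET_VALUE) as key:
        # 16384 is the value suggested above for machines with more than 256 MB of RAM.
        winreg.SetValueEx(key, "SizReqBuf", 0, winreg.REG_DWORD, 16384)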

Tweak the Network Redirector Buffers

By increasing the number of these buffers, you may get a higher transfer rate for data traveling through the network. Open the Registry Editor and navigate to this location:

HKEY_LOCAL_MACHINE > System > CurrentControlSet > Services > LanmanWorkstation > Parameters

Using the procedure explained in the previous tweak, add two new DWORD values:

MaxCmds and MaxThreads

Give both the same value, anywhere between 0 and 255; the recommended value is 64.
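Both values can be written in one go with the same winreg approach used above; this is just a sketch under the same assumptions (Python 3, Windows, elevated prompt).

    import winreg

    LANMAN = r"SYSTEM\CurrentControlSet\Services\LanmanWorkstation\Parameters"

    with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, LANMAN, 0,
                            winreg.KEY_SET_VALUE) as key:
        # 64 is the recommended value for both settings.
        for name in ("MaxCmds", "MaxThreads"):
            winreg.SetValueEx(key, name, 0, winreg.REG_DWORD, 64)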

Eliminate the shares from My Network Places

Windows has an annoying habit of placing a shortcut in My Network Places for each remote folder accessed over the network. This creates an unpleasant delay when browsing the network. There are two ways to teach Windows not to do that anymore.

For Windows XP Home Edition

Using the Registry Editor, locate HKEY_CURRENT_USER > Software > Microsoft > Windows > CurrentVersion > Policies > Explorer and add a new DWORD value called NoRecentDocsNetHood, setting it to 1. A value of 1 stops shares from being added to My Network Places.
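This one can also be scripted; note that the value lives under HKEY_CURRENT_USER, so no elevation should be needed. As before, this is a minimal sketch assuming Python 3 on Windows.

    import winreg

    POLICIES = r"Software\Microsoft\Windows\CurrentVersion\Policies\Explorer"

    with winreg.CreateKeyEx(winreg.HKEY_CURRENT_USER, POLICIES, 0,
                            winreg.KEY_SET_VALUE) as key:
        # 1 = do not add shortcuts to recently opened shares in My Network Places.
        winreg.SetValueEx(key, "NoRecentDocsNetHood", 0, winreg.REG_DWORD, 1)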

For Windows XP Professional

Under this version of Windows, the process is easier and there is no need to edit the registry. Go to Start > Run and type Gpedit.msc to open the Group Policy Editor. Then go to User Configuration > Administrative Templates > Desktop and, in the right panel, enable the option "Do not add shares of recently opened documents to My Network Places".

Deploying Microsoft Office SharePoint Server 2007

Deploying Microsoft Office SharePoint Server 2007 can deliver a positive impact on the workflow inside an institution. Case in point: the Menninger Clinic. Microsoft revealed that the adoption of Office SharePoint Server 2007 has reduced paperwork by up to 25%. According to the Redmond company, the clinic has adopted a single system built on Microsoft Office SharePoint Server 2007 (MOSS) and Microsoft Office InfoPath 2007.

“Microsoft SharePoint has been invaluable to us, as the application we were working with was becoming too complicated to maintain with one full-time employee and two consultants dedicated to its maintenance,” said Terry Janis, director of Information Technology at Menninger. “Now we can devote those funds to other projects that contribute to higher-quality patient care — all while fulfilling HIPAA requirements.”

“Office SharePoint Server 2007 offers Menninger the ability to easily store, manage and retrieve patient demographic information and clinical documentation,” said Chris Sullivan, Healthcare Provider Solutions Director of the U.S. Healthcare and Life Sciences Group at Microsoft. “A normalized relational database design for this application might require dozens of tables. The document-centric design of Menninger’s system uses SharePoint Server 2007 to reduce this to a handful of lists and document libraries.”

According to Microsoft, the Menninger Clinic, an international psychiatric hospital in Houston, has managed to save $80,000 per year following the deployment of the new system based on Office SharePoint Server 2007. Microsoft's announcement coincides with the Healthcare Information and Management Systems Society's annual IT conference for 2007.

Monday, February 19, 2007

5 Things the Boss Should Know About Spam Fighting

"Sysadmins and email administrators were asked to identify the one thing they wish the CIO understood about their efforts to fight spam. The CIO website is now running their five most important tips, in an effort to educate the corporate brass. Recommendations are mostly along the lines of informing corporate management; letting bosses know that there is no 'silver bullet', and that the battle will never really end. There's also a suggestion to educate on technical matters, bringing executives into the loop on terms like SMTP and POP. Their first recommendation, though, is to make sure no mail is lost. 'This is a risk management practice, and you need to decide where you want to put your risk. Would you rather risk getting spam with lower risk of losing/delaying messages you actually wanted to get, or would you rather risk losing/delaying legitimate messages with lower risk of spam? You can't have both, no matter how loudly you scream.'"

Getting Clueful: Five Things You Should Know About Fighting Spam


The battle for your users' e-mail inboxes probably will never end, but it's not a failure of technology. Experienced e-mail and system administrators share the key points they really, really wish you understood.

By Esther Schindler



February 15, 2007

When you started your e-mail client this morning, you were prepared for the usual set of correspondence: your daily dose of corporate politics, a dollop of technical emergencies and the background hum of projects under way. Annoyingly, your inbox also contained a few messages advertising products you would never buy, and perhaps a phishing notice warning that your account was frozen at a financial institution where you don't have an account. Your company has antispam measures in place; surely, the IT staff should be able to keep this junk out of your inbox?

Perhaps they can, but the task of doing so has become much more difficult in recent years, partly because 85 percent or more of all e-mail traffic today is spam. If you haven't been listening closely to the dark mutterings in your e-mail administrator's office, you may have missed out on significant clues about the nature of the problem and what the IT department can do to address it. However, when you do listen to the technical staff, it's easy to get lost in their arcane acronyms, such as SPF and RBLs, and you may drown in more information than you really wanted to know.

To learn what's really happening in the technical trenches, we asked several e-mail administrators to tell us about the key items—the single key item, in fact—that they wish their IT management understood. If you read through their wish list, you may be able to understand the nature of their challenges and, perhaps, help them clean out your inbox.

In brief, says Keith Brooks, vice president at Vanessa Brooks, "Stopping spam is a mixture of luck, intelligence, alcohol and planning." With luck, he says, your CEO never hears about spam. "But without it, the CIO never stops hearing about this issue."

1. Lose No Mail.

The primary directive, for e-mail admins, is "lose no mail." If that means that an occasional spam message wends its merry way into users' mailboxes, so be it. E-mail administrators would prefer that users encounter a few annoyances than miss an important business message.

Dr. Ken Olum, a research assistant professor in the Tufts Institute of Cosmology, also maintains the institute's computers. Olum explains, "The most important thing is never to silently drop an important e-mail. If you just drop it, your correspondent thinks you aren't answering on purpose or forgets all about you. So suspected spam should always be rejected and never dropped. Sequestering it is only slightly better than dropping it, because you have to look through the sequestered spam, and most people don't bother."

Nonetheless, many CIOs ask their IT department to keep the e-mail boxes clear of anything offensive. Yet, according to Scott Kitterman of ControlledMail.com, "I want zero spam and I want to never ever miss a legitimate message" isn't feasible. Kitterman explains, "This is a risk management practice, and you need to decide where you want to put your risk. Would you rather risk getting spam with lower risk of losing/delaying messages you actually wanted to get, or would you rather risk losing/delaying legitimate messages with lower risk of spam? You can't have both, no matter how loudly you scream."

Tom Limoncelli, author of The Practice of System and Network Administration (Addison-Wesley) and Time Management for System Administrators (O'Reilly), stresses that because fighting spam is not an exact science, there always will be false positives and false negatives. The IT department has to cope with this. Limoncelli had a CTO complain when he missed an important message because it was caught in the spam filter. Says Limoncelli, "This system sent him e-mail once a day with a list of his messages that had been blocked; clicking on any of them 'releases' it from the quarantine. … He wanted a report for every message that was blocked. At least that was his initial request; he then realized that he had asked for an e-mail to warn him of every e-mail!"



2. There's No Silver Bullet.

In many areas of IT, the long-term solution is a simple one: Adopt the single right methodology, hire the right consultant, buy the most appropriate product. But your IT staff wants you to understand that spam isn't a problem that can be solved with a single technology, a single product or any one answer.

Vendors of spam-fighting hardware and software will tell you different—but they're wrong. Bill Cole, senior technical specialist at T-Systems North America, has been fighting spam for more than a decade. Everyone involved in that fight, he says, dreams of the "Final Ultimate Solution to the Spam Problem." But, he cautions, people who yearn for a single answer may fall prey to a vendor's magical "answer," but "in a year or so, the magic is gone and the spammers have adapted." Then, he notes, "managers get upset, a new 'solution' gets deployed, and the cycle goes around again."

Brad Knowles, a consultant, author, and former senior Internet mail systems administrator for AOL, adds, "In almost all cases, the so-called 'simple' answers are the ones that don't work. In fact, they're almost always the ones that make the problem much worse than it already was. Since we've been fighting spam for over a decade, pretty much all the good simple ideas have already been thought of and implemented, and the spammers have already worked around them."

Unfortunately, the result is that fighting spam is a complex endeavor. Says Knowles, "You're probably going to have to use multiple solutions from multiple sources. You're going to have to keep a constant eye on things to make sure that, when they blow up, you find out as quickly as possible. And you [need] multiple layers of business continuity plans in place to handle the situation."

3. It's a Continuous Battle. Budget Accordingly.

Spammers succeed only when they get messages to user inboxes, so they are motivated to counter any barrier between them and their intended recipient. As a result, your IT department will never be done implementing solutions.

Points out David Linn, computer systems analyst III at Vanderbilt, "Spam pushers update their tools as fast as the spam defenders work out a defense to yesterday's attack type. This seems to be the thing that those who want to buy an off-the-shelf solution and then forget about it least understand and least want to understand. The very speed of innovation that makes 'Internet time' so attractive in other contexts is the enemy here."

Cole describes spam as mail that evolves and adapts and thus requires an adaptive and evolutionary approach to defense. Spam cannot be handled as a discrete project with a list of deliverables and a three-month project plan. While you may initially have success by doing so, he says, "Expect to repeat the exercise again next year, and the year after that, and on infinitely."

This is a major nuisance to managers, because they have to pay a staff of high-skill people (either directly or indirectly) for ongoing open-ended work. As Cole notes, "Like many other areas of security, it is a potential bottomless pit for computing resources and the best technical staff and hence for money, so drawing the lines on it are a managerial challenge."

Martin Schuster, in charge of IT at CenterPoint, argues the business case for spam defense by extending spam fighting past technical and ethical issues (such as, say, forcing everyone to use PNG instead of GIF, not use special characters in file names, and so on). Schuster focuses on the financial cost and motivations, from the cost of sending spam to the cost of removing it (from infrastructure to manually deleting messages). He comments, "Fighting spam costs money. If your mail server administrator talks to you about fighting spam, and wants equipment and time to implement it, listen to him. His haircut may seem weird, but he's talking about saving money."

Adam Moskowitz, a Boston-area independent consultant and author of Budgeting for Sysadmins, says, "If a sysadmin can't show that fighting spam is costing the company money, then that sysadmin has no business talking to management about the problem. If the sysadmin doesn't understand and can't demonstrate how fighting spam affects the company's bottom line, upper management certainly isn't going to be able to make the connection."

Does all this seem insurmountable, given your company's resources? If you aren't willing or able to manage the e-mail and spam measures yourself, outsource it. Plenty of hosted e-mail service providers can handle part or all of a company's e-mail system. According to Limoncelli, "The spam system has to be upgraded constantly. This can fill an entire full-time position. If you don't have that kind of staffing, the best solution is to let someone else handle it."

4. Understand the Basics of E-mail Technology.

Administrator Micheal Espinola Jr. says his primary wish is for "top management to understand the mechanics of how e-mail works. Then, and I believe only then, would they be able to grasp the concepts that elude most users of e-mail." When management has the right information, Espinola believes, it can make excellent decisions, but a lack of understanding can severely hinder that ability. "If the admin is wasting time troubleshooting or improvising because of subpar technology, it takes away from time spent for the productivity issues of others."

This doesn't mean you have to become a guru on the subject; just learn enough to understand what your e-mail administrator is telling you. Michael Silver, network administrator at Parkland Regional Library, emphasizes, "A great deal of difficulty arises when trying to address spam—and e-mail problems in general—if the people involved don't have a good understanding of how the mail system works, including a basic understanding of the different protocols, services, etc. I don't expect [CIOs] to know the ins and outs of configuring sendmail, but [they] should have a basic understanding of terms like POP, SMTP, IMAP, MTA and MUA." Added an admin named Eric, "If the CIO knows and understands the mechanisms of how e-mail is received and sent, then explaining the need for additional servers, bandwidth, storage, redundancy, etc., is accomplished much more easily. ... Once you understand that, you get a very good insight in the shortcomings of the SMTP protocol and how/why spam is becoming such a huge problem and cost nightmare."

While most admins want you to understand e-mail basics to make it easier to explain corporate challenges, sometimes it gets personal. Larry Ware, Federal Signal Global Network Boffin, is frustrated by managers who don't understand how the technology works. "They spent some money for some software; why is spam still getting in? Even worse: Why did the system block mail from my nephew? He is running a mail server on his cable modem; he clearly knows how to set up a mail system, why can't you? Explaining why his nephew's mail server is in dozens of public blocking lists for being a spam cannon is a lot harder than you might think. How do you do it without implying his nephew is an idiot?"

Another side effect of the lax understanding of e-mail technology is that the entire system is misused, with spam only one tiny part. Stewart Dean, a Unix system admin at Bard College, says, "The result is users who regard e-mail as a sort of problematic tool that might as well be magic. Not understanding it, they bang on it and misuse it in the most preposterous ways." According to Dean, that's why your e-mail admin screams when users attach a 200MB file to a mail message without knowing that it was 200MB or even what 200MB means. Then those same users wonder why it doesn't go through. Worse, they then repeatedly resend the message. Finally, Dean says, "they get furious at IT that the goddamn magic isn't working."

5. People are Making Money on Spam. Respond Appropriately.

Most of e-mail administrators' time is spent dealing with technology issues or trying to explain it to you in business terms. But for some, the issue is a larger one: someone else's business model. They want you to understand that spam is sent by an intelligent, adaptable and well-funded enemy. Some admins believe that with corporate budgets and legal resources, it's even possible to fight back.

Brent Jones, network technician at Smarsh Financial Technologies, wants IT management to understand that someone is working very hard to destroy the spam barriers administrators put in place. "There is a large financial incentive [for spammers] to get their spam into your mailbox," he says. "They will fight to get your eyes, and it costs them nothing to try everything in the book."

Nor are spammers ordinary businessmen. Alessandro Vesely, a freelance programmer and service provider in Milano, Italy, points out that "much spam is the result of criminal actions, such as infecting IT systems and using false identities. Technically, spam can be stopped if everybody else wants to be responsible for what they send. What lacks is the political will to do so."

Sam Varshavchik is an independent contract consultant who serves many of the better-known financial firms on Wall Street. He believes strongly that "CIOs should stop giving their business to Internet providers with a bad track record of engaging in spam support services and instead encourage and support—with their budgets—lesser-known but more socially responsible and respected providers of data and Internet service." If CIOs instituted a policy of disqualifying any vendor of Internet, data or communication services that appears anywhere on Spamhaus's top 10 list from doing any business with the company, Varshavchik feels, "the spam problem will pretty much disappear, mostly overnight." Few CIOs who are considering vendors take the time to do so, he says, and those few minutes can save an untold amount of grief.

Perhaps you'll take some of the e-mail admins' advice; perhaps not. But they desperately wish that company management would support them in the endeavor to clean up users' e-mail inboxes. Fritz Borgsted, a system engineer at Unicorn Communications who also leads the development of ASSP (Anti-Spam SMTP Proxy, an open-source project), believes that fighting spam reflects the quality of life in the digital age. Borgsted says, "A mailbox without spam is like a private restroom; with spam, it looks like a public one."



Monday, February 12, 2007

Intel demonstrates 80-core processor

Now that the Megahertz race has faded into the distance (we hear it was a myth), Intel is well and truly kicking off a multi-core war with the demonstration of an 80-core research processor in San Francisco last week. It's not the first multi-core processor to reach double figures -- a company called ClearSpeed put 96 cores onto one of its CPUs -- but it's the first to be accompanied by the aim of making it generally available, an aim that Intel hopes to realize within a five-year timeframe.

The long timeframe is required because current operating systems and software don't take full advantage of multi-core processors. For Intel to successfully market processors with more than, say, 4 cores, there needs to be an equal effort from software programmers, which is why producing an 80-core processor is only half the battle. On paper, 80 cores sounds impressive, but when the software isn't doing anything imaginative with them the result is rather disappointing: during a demonstration, Intel could only manage to get 1 teraflop out of the chip, a figure which many medium- to high-end graphics cards are easily capable of.

The multi-core war may have begun, but the battle will be fought with software. That's not to say that the hardware side has already been won: apparently the test chip is much larger than equivalent chips -- 275 mm², versus a typical Core 2 Duo's 143 mm² -- and Intel currently has no way to hook up memory to the chip. Hopefully half a decade will be long enough to sort out these "issues."

[Thanks, Michael]

U.S. cyber counterattack: Bomb 'em one way or the other

National Cyber Response Coordination Group establishing proper response to cyberattacks





San Francisco — If the United States found itself under a major cyberattack aimed at undermining the nation’s critical information infrastructure, the Department of Defense is prepared, based on the authority of the president, to launch a cyber counterattack or an actual bombing of an attack source.


The primary group responsible for analyzing the need for any cyber counterstrike is the National Cyber Response Coordination Group (NCRCG). The three key members of the NCRCG, who hail from the US-CERT computer-readiness team, the Department of Justice and the Defense Department, this week described how they would seek to coordinate a national response in the event of a major cyber-event from a known attacker.

This week's massive but unsuccessful denial-of-service (DoS) attack on the Internet's root DNS servers, which also targeted military and other networks, did not rise to the level of requiring a response, but it made the possibility of a massive Internet collapse more real than theoretical. Had the attack been successful, there might have been a cyber counterstrike from the United States, said Mark Hall, director of the international information assurance program for the Defense Department and the Defense Department co-chair of the NCRCG, who spoke on the topic of cyber-response during the RSA Conference here.

“We have to be able to respond,” Hall said. “We need to be in a coordinated response.”

He noted that the Defense Department's network, subject to millions of probes each day, has "the biggest target on its back."

But a smooth cyber-response remains a work in progress. The NCRCG’s three co-chairs acknowledge it’s not simple coordinating communications and information-gathering across government and industry even in the best of circumstances, much less if a significant portion of the Internet or traditional voice communications were suddenly struck down. But they asserted the NCRCG is “ready to stand up” to confront a catastrophic cyber-event to defend the country.

“We’re working with key vendors to bring the right talent together for a mitigation strategy,” said Jerry Dixon, deputy director for operations for the National Cyber Security Division at US-CERT. “We recognize much infrastructure is operated by the private sector.” The U.S. government has conducted cyber war games in its CyberStorm exercise last year and is planning a second one.

Computer Software and Services Trends

-- Downsizing: Distributed computing with client/server solutions is most common in corporate IT, in conjunction with keeping the economical functions of the mainframe environment, preferably with identical GUIs and operation procedures. Major suppliers catering to this trend are Novell; NSC Group (Legacy Downsizing); Downsizing Systems; Suite Software; Fenger + Graetzer.

-- Client/Server Solutions: This trend combines equipment from different manufacturers, performance classes, and applications in a network, with the server managing data transactions and the client handling presentation and local application processing. As a result of the trend toward distributed networking, standard software products must be able to run on various platforms and in a variety of network configurations. Thus, suppliers of utilities/tools software can create new features and functions to allow the end-user to utilize the expanded application areas. Major suppliers to this segment are: Oracle; Novell; ACI; dc soft; CAI; CA; WIN!; Wilken; Onmis Software.

-- Outsourcing: Outsourcing of software and IT services is becoming more and more popular among medium-sized and even smaller firms, which engage specialist teams for specialist IT tasks such as web server operation and maintenance. This trend should allow U.S. software suppliers to find profitable niches, although these clients usually prefer the "outsource" to be "in-country." Major outsourcing software suppliers in Germany are: AC Service; IIS Infotech; IBM; SBS; SAP; Debis; and J.D. Edwards.

-- Integrated Standard Software: Competition in the standard packaged software segment has become harder over the past two to three years. As a result, prices have dropped markedly and this pressure is expected to continue. Thus, the trend is away from standard software and toward component software products; still, this remains a receptive market for competitively priced standard software, especially business application products and enterprise resource planning (ERP) systems. Important suppliers in this segment are: SAP; Baan; SSA; JDE; JBA; CA; PeopleSoft; Oracle.

-- Implementation of MIS (Management Information Systems): A large number of corporations have meanwhile installed data warehouses. To fully achieve a competitive advantage, experts recommend combining data warehousing with "data mining," the latest in "intelligent" data management, which combines statistical and artificial-intelligence functions to detect patterns and trends in an otherwise fully automated data analysis. Data mining tools can perform classification and segmentation, resulting in large cost savings, and are used to predict future customer behavior or to structure a complex target group. MIS processes encompass everything from statistical methods to expert systems to fuzzy logic to neural networks. Major suppliers to this trend are: Angoss; Datamind; IBM; ISL; Isoft; NeoVista; Pilot; SAS; Silicon Graphics; SPSS; Syllogic; Thinking Machines; Informix; NCR; Oracle; and Xsys. Major local suppliers are: Bissantz; Kueppers and Co.; Business Objects; and Cognos.

-- Business Reengineering: Software supporting process-oriented tasks such as identifying core operations, intensifying customer orientation, improving the flow of information, shortening processing times, precisely allocating processing costs, and reducing delivery times and stock levels will continue to meet high demand. Suppliers of such software tools are: Aeneis; Aris; Bonapart; BPWin; ERWin; Logichain; Moogo; Proplan; and Sycat.

Saturday, February 10, 2007

NTT DoCoMo Achieves Data Transmission of Almost 5Gbps



Japan's leading mobile operator, NTT DoCoMo, came close to hitting another data transmission milestone while researching next-generation cellular phone systems, in an experiment conducted on December 25 last year in Yokosuka, Kanagawa Prefecture, Japan.

The company managed to transmit data at almost 5Gbps to a receiver moving at 10 km per hour, according to Yoshiki Kakuda, a spokesman for the Tokyo-based company.

He also said that the company's public relations department did not know the exact speed that was achieved, but that it was quite close to 5Gbps. NTT DoCoMo has been working on 4G technology for almost 10 years, but it still has a long way to go before the technology is launched.

The company currently plans to launch 4G in Japan by 2010, but it has also indicated that it first intends to speed up its WCDMA 3G network before switching to the new technology.

Approximately a year ago, NTT DoCoMo managed to transmit data at 2.5Gbps, and this time it managed to double that figure by increasing the number of MIMO transmitting and receiving antennas from six to twelve and by using better signal processing in the receiver.

Compared with the test in December 2005, the frequency spectrum efficiency was also doubled, from 25 bps/Hz to 50 bps/Hz. The mobile operator also plans to launch a 'Super 3G' service by 2010 that will offer data transmission at approximately 100Mbps. NTT DoCoMo will present all the details of this experiment next week at the 3GSM World Congress in Barcelona.
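As a rough sanity check on those figures, data rate equals spectral efficiency multiplied by occupied bandwidth, so the numbers quoted above imply a channel of roughly 100 MHz. The snippet below only illustrates that textbook relationship; the actual channel width is not stated in the article.

    # Back-of-the-envelope check: data rate = spectral efficiency x bandwidth.
    data_rate_bps = 5e9           # ~5 Gbps reported in the trial
    spectral_efficiency = 50.0    # 50 bps/Hz reported in the trial

    implied_bandwidth_hz = data_rate_bps / spectral_efficiency
    print("Implied channel bandwidth: %.0f MHz" % (implied_bandwidth_hz / 1e6))
    # -> Implied channel bandwidth: 100 MHz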

The company has also announced that it is looking forward to contributing to the global standardization of 4G through its ongoing research and development.

Wednesday, January 31, 2007

Microsoft Tops Corporate-Reputation Survey

"Microsoft beat out Johnson & Johnson for the top spot in the annual Wall Street Journal survey of the reputations of U.S. companies. Bill Gates's personal philanthropy boosted the public's opinion of Microsoft, helping to end J&J's seven-year run at No. 1. From the article: 'Mr. Gates demonstrates how much the reputation of a corporate leader can rub off on his company. Formerly chief executive officer and now chairman of Microsoft, he contributed to a marked improvement in the company's emotional appeal. Jeanie Cummins, a survey respondent and homemaker in Olive Hill, Ky., says Mr. Gates's philanthropy made her a much bigger fan of Microsoft. "He showed he cared more for people than all the money he made building Microsoft from the ground up," she says. "I wish all the other big shots could do something like this." To be sure, some respondents still complain that Microsoft bullies its competitors and unfairly monopolizes the software business. But such criticism is less biting and less pervasive than it was just a few years ago.'"

Strong Passwords

Usually, when creating an account, you have to provide a user name and a password. I say "usually" because sometimes these are generated automatically and sent to you. Most users choose a regular ID (username), something representative (in the case of automatically generated IDs, it will usually be your email address).

With passwords, things are a bit more complicated, since the password is what actually protects your sensitive content.

When it comes to cracking a password, hackers use two methods: password recovery and repetitive "brute force" attacks. The first consists in making the system believe that the hacker is an authorized user or administrator. Brute force is software that repeatedly tries letter, number and symbol combinations until it finds the right elements of your password (it can try hundreds of passwords per minute). Given an adequate dictionary (sometimes the hacker may know a little about your habits and way of thinking) and enough time, any password can be cracked.

So why bother with password protection if it is impossible to keep your data safe simply by applying a countersign? The only element that will discourage hackers from cracking your password is time, and time is something any hacker has. A weak password can be learned in just a few minutes, while a very strong one can take days. The stronger the password, the more time is needed to crack it, and after a couple of hours most criminals give up if the "pot" is not important enough.

A weak password is basically any plain word or expression. The key to an excellent countersign is for it to be lengthy and to incorporate as many symbols (“@”, “#”, “*” etc.), special characters (period, comma, hyphen, space) and letters (both upper and lower case) as possible. The difficulty resides in the fact that one has to use all of these elements in a password that is still easy to remember.

Creating a weak password is easy, as you can choose any word you want. Browsing the Internet, I learned that a six-character password is merely "OK", which in my opinion means it is fallible. A ten-character pass key is considered good by most, while a 15-character countersign is unanimously considered the best (passwords of 14 characters or fewer are also stored by Windows as weak LM hashes in hidden system files, whereas Windows will not store an LM hash for passwords of 15 characters or more). Even Microsoft acknowledges that a 15-character password made up only of random letters and numbers is some 33,000 times stronger than an 8-character password drawn from the entire keyboard.

Unfortunately, some computers or online systems limit the length of the countersign, so a 15-character password is not always supported. However, you can use all sorts of tricks to create a strong, memorable countersign with fewer than 15 characters (you have the keyboard and your imagination at your disposal).

First of all, think of a word or multi-word phrase that is meaningful to you. It doesn't matter how long it is, but don't turn it into a paragraph. In my example, I will start from “softpedia”. This password, despite having 9 characters, reaches only the "weak" level on the strength scale provided by Microsoft. By making different combinations of characters on my keyboard, I will try to pump it up to the "strong" level.

The first step is combining upper case letters with lower case ones, so the result looks like this: “SoFtPeDia”. This simple trick already pumps the password up to the "medium" level. Replacing letters with symbols and special characters will strengthen it further. Changing “e” to “3”, “a” to “@”, “1” or “i” to “!”, or turning “g” into “6”, “s” into “$” and “o” into “0” (zero) can result in strong passwords.

By following the strategy above and replacing the letters with other characters, I now get “$0FtP3D!@”. It looks good, and these changes brought my password to a strong level of security. To get it to the best security level, all I have to do is add “eez#1”. This way, I have turned a phrase (“Softpedia is number one”) into a very hard to crack password (“$0FtP3D!@eez#1”). That is 14 characters, but by adding spaces between the words you can push it past the 14-character limit, so that the weak LM hash is never stored in Windows' hidden system files.
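Purely as an illustration of the recipe above (the substitution table is the one suggested in the text, and a determined attacker knows these tricks too, so length still matters most), the transformation can be written out as a short script:

    # Illustrative only: applies the article's recipe to a phrase.
    SUBSTITUTIONS = {"e": "3", "a": "@", "i": "!", "o": "0", "s": "$", "g": "6"}

    def strengthen(phrase):
        # Step 1: alternate upper and lower case, starting with upper case.
        mixed = "".join(c.upper() if i % 2 == 0 else c.lower()
                        for i, c in enumerate(phrase))
        # Step 2: swap selected letters for look-alike symbols and digits.
        return "".join(SUBSTITUTIONS.get(c.lower(), c) for c in mixed)

    print(strengthen("softpedia"))            # -> $0FtP3D!@
    print(strengthen("softpedia") + "eez#1")  # -> $0FtP3D!@eez#1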

Generally, you should avoid creating passwords from repetitive (1111) or sequential (123456) numbers; these offer so little protection that a blank password (no password at all) has been argued to be more effective. Simply misspelling a word, or typing it with its letters replaced by symbols or numbers, will not fool a good hacker on its own, but used together these tricks definitely contribute to creating a strong countersign.

Contrary to the popular belief that passwords should never be written down, it has been argued that countersigns kept on paper are better protected than those stored in password managers or somewhere on the computer. Of course, writing the password on a piece of paper and then failing to keep that paper in a safe place will result in weak security, and all the trouble of making the password strong will have been useless.

7 Ways to Be Mistaken for a Spammer

"The "This is Spam" button popping up on many service providers' email services can be empowering for a user, but it can also be the kiss of death for a legitimate business that gets canned with a click of that button. Dark Reading has a story on seven common missteps that can lead to a case of mistaken spammmer identity for a legit business trying to send its marketing email, newsletters or other correspondence."

Why "Yahoo" Is The #1 Search Term On Google

"Google Trends indicates that over the course of the past year the search term "Yahoo" became more popular than "sex", making it the #1 query on Google. Yahoo apparently faces a similar dilemma with roles reversed: When you search for "Google" on Yahoo, Yahoo thoughtfully displays a second search box as if to tell you, "Hey cutie, you have a search engine right in front of you!" A puzzling phenomenon? An strange aberration?"

MySQL Prepares To Go Public

"MySQL CEO Marten Mickos told Computer Business Review the company plans to go public: 'Now entering its twelfth year, the company has built up just less than 10,000 paying customers, and an installed base estimated to be close to 10 million... When it does go public, MySQL will be one of only a handful of open source vendors to do so. Red Hat, VA Linux (now VA Software), and Caldera (now SCO Group) led the way in 1999 and 2000...'"

Tuesday, January 30, 2007

'Dumb Terminals' Can Be a Smart Move for Companies

"More companies are forgoing desktop and laptop computers for dumb terminals — reversing a trend toward powerful individual machines that has been in motion for two decades, the Wall Street Journal reports. 'Because the terminals have no moving parts such as fans or hard drives that can break, the machines typically require less maintenance and last longer than PCs. Mark Margevicius, an analyst at research firm Gartner Inc., estimates companies can save 10% to 40% in computer-management costs when switching to terminals from desktops. In addition, the basic terminals appear to offer improved security. Because the systems are designed to keep data on a server, sensitive information isn't lost if a terminal gets lost, stolen or damaged. And if security programs or other applications need to be updated, the new software is installed on only the central servers, rather than on all the individual PCs scattered throughout a network.'"

iPod Shuffle colors released!

In five brilliant colors and just $79, the 1GB iPod shuffle lets you wear up to 240 songs1 on your sleeve. Or your lapel. Or your belt. Clip on iPod shuffle and wear it as a badge of musical devotion.




One size fits all


You know what they say about good things and small packages. But when something 1.62 inches long and about half an ounce holds up to 240 songs, “good” and “small” don’t cut it. Especially when you can listen to your music for up to 12 continuous hours.2 In fact, iPod shuffle just may be the biggest thing in small.


Ready to wear


Clip it to your coin pocket. Clip it to your bag. No matter where you clip your skip-free iPod shuffle, you’ll have instant access to music. In silver, pink, green, blue, and orange, iPod shuffle goes with everything. Put it on, turn it up, and turn some heads.


Remix and match


The first step to wearing 240 songs is downloading iTunes — free. Then drop your iPod shuffle into the included dock, plug the dock into your Mac or PC’s USB port, and sync in minutes. Got more than 240 songs? No problem. Let iTunes autofill your iPod shuffle and get a new musical experience every time.

  1. 1GB = 1 billion bytes; actual formatted capacity less. Song capacity is based on 4 minutes per song and 128-Kbps AAC encoding; actual capacity varies by content.

  2. Rechargeable batteries have a limited number of charge cycles and may eventually need to be replaced. Battery life and number of charge cycles vary by use and settings. See www.apple.com/batteries for more information.

Monday, January 29, 2007

The Future Microchips Will Be Smaller, Faster and Will Lose Much Less Energy

One of the most important transistor breakthroughs of the last four decades paves the way for the much desired development of smaller and more powerful microchips.

Intel Corp. and IBM achieved the technological breakthrough by using an exotic new material in transistors, the tiny switches that are the building blocks of microchips. "At the transistor level, we haven't changed the basic materials since the 1960s. So it's a real big breakthrough," said Dan Hutcheson, head of VLSI Research, an industry consultant. "Moore's Law was coming to a grinding halt," he added, an allusion to the industry maxim formulated by Intel co-founder Gordon Moore, who said that the number of transistors on a chip doubles roughly every two years. As a result of this principle, chips have become smaller and faster, in an industry with $250 billion in annual sales.

The new technology, based on the metal hafnium, enables the development of circuitry as small as 45 nanometers (about 1/2,000 of the width of a human hair). "We do expect that those products will deliver higher performance levels than existing products," said Steve Smith, vice president of Intel's digital enterprise group operations. "What we're seeing is excellent double-digit performance gains on media applications."

The new technology should last for at least two more technology generations, by which point circuitry will shrink to 22 nm. "We've been doing this for 40 years and we've got to the point where some of these layers you have to make smaller wouldn't scale anymore," said IBM Chief Technologist Bernie Meyerson. "We are getting down to a stage of technology where people have wondered if you could really ever go there, and we have definitely shown a roadmap down to these unbelievably tiny dimensions," he added.

The current technology relies on a silicon-based insulating layer only about five atoms thick, so a great deal of electricity leaks through it, wasting power and shortening battery life. "It's like running two faucets when you only need one. You're actually wasting more water than you're actually using," said Jim McGregor, an analyst at the technology market research firm In-Stat.

The new technology brings several benefits: smaller transistors, potentially doubling their total number in a given area; faster switching, with speed gains of over 20%; and a reduction in power leakage of more than 80%. "Consumers are going towards mobility and power-sensitive solutions. We need to not only make things smaller and more efficient but also use less power," McGregor said.
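As a rough sanity check of the "doubling" claim (my arithmetic, not Intel's or IBM's): if the previous node is taken to be 65 nm, then shrinking features to 45 nm lets roughly twice as many transistors fit in the same area, since density scales with the inverse square of the feature size.

```python
# Rough sanity check of the scaling claims above. The 65 nm prior node
# is an assumption made for illustration.

old_node_nm = 65.0      # prior process generation (assumed)
new_node_nm = 45.0      # the new hafnium-based generation
density_gain = (old_node_nm / new_node_nm) ** 2
print(f"density gain: {density_gain:.2f}x")       # ~2.09x, i.e. roughly double

leakage_cut = 0.80      # the article's ">80% less leakage" figure
print(f"leakage remaining: {(1 - leakage_cut) * 100:.0f}% of the old level")
```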

Many obstacles remain to developing new chip generations; for example, it is increasingly difficult to create light beams narrow enough to etch ever-finer circuitry onto chips. "But this takes out what has been considered the biggest number one roadblock," VLSI's Hutcheson said.

The Satellite-Receiving Multimedia Car of the Future

Current car radios present several inconveniences: crackling stations, signal loss in tunnels, and fiddly tuning to the correct frequency.

Recently, however, ESA and nine industry and service-sector partners presented a prototype of what could be the multimedia car radio of the future at the Noordwijk Space Expo in the Netherlands.

The new car radio functions like a satellite receiver for television channels. Instead of a large dish antenna on the car's roof, there is a special mobile antenna, flattened so that it can be integrated almost invisibly into the bodywork, which picks up signals in the Ku frequency band employed by communications satellites.

Integrating a satellite receiver into a car is not a novelty: in the US, more than 13 million drivers use XM Radio and Sirius, services that broadcast to mobile satellite receivers via satellites supplemented by a network of ground-based repeater transmitters.

However, the new European multimedia system is much more advanced. Instead of new satellites and a web of ground-based transmitters, which would require a huge investment exceeding a billion euros, the new system uses only the already existing communication satellites.

Moreover, the mobile multimedia system uses a cache memory (a hard disk or its solid-state equivalent). The received signal can be recorded (as in a personal video recorder) and listened to after a short time shift, or much later. This technique overcomes the problem of signal loss in tunnels or of obstructions disrupting the programme.

In this way, the driver can select part of a broadcast to listen to, or pause a programme of interest while stopping to refuel.
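The time-shift mechanism is easy to picture in code. Below is a minimal sketch, my own illustration of the principle rather than the ESA prototype's actual software: whatever the antenna delivers is written into a local cache, and playback reads from that cache with a delay, so short reception gaps (a tunnel, an overpass) never interrupt what the listener hears.

```python
# Minimal illustration of time-shifted playback from a local cache.

from collections import deque

class TimeShiftBuffer:
    def __init__(self):
        self.cache = deque()          # stands in for the hard disk / flash cache

    def on_receive(self, chunk):
        """Called whenever the antenna delivers data; simply not called in a tunnel."""
        self.cache.append(chunk)

    def play_next(self):
        """Playback drains the cache; returns None only if the cache runs dry."""
        return self.cache.popleft() if self.cache else None

buf = TimeShiftBuffer()
for chunk in ["news-1", "news-2", "news-3"]:   # received before entering a tunnel
    buf.on_receive(chunk)
# In the tunnel nothing new arrives, but playback continues from the cache:
print(buf.play_next(), buf.play_next())        # news-1 news-2
```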

The engineers had to develop an entirely new antenna that could be easily integrated into a car, since the large, fixed dish antennas designed to receive satellite television signals were ruled out as impractical.

The project has taken three years of research, but the new technology holds great potential for the car industry and for broadcasting.

Photo credit: BMW

Who Killed the Webmaster?

Back in the frontier days of the web (when flaming skulls, scrolling marquees, and rainbow divider lines dominated the landscape), “Webmaster” was a vaunted, almost mythical title. The Webmaster was a techno-shaman versed in the black arts needed to make words and images appear on this new-fangled Information Superhighway. With the rise of the Webmaster coinciding with the explosive growth of the web, everyone predicted the birth of a new, well-paying, in-demand profession. Yet in 2007, this person has somehow vanished; even the term is scarcely mentioned. What happened? A decade later, I’m left wondering: “Who killed the Webmaster?”

Suspect #1: The march of technology


By 2000, it seemed every person in the developed world had a brother-in-law who created websites on the side. Armed with FrontPage and a pirated copy of Photoshop, he’d charge a reasonable fee per page (though posting more than three images cost extra).

Eventually the web hit equilibrium, and merely having a website no longer made a company hip and cutting-edge. Now management demanded that their website look better than the site ranked immediately above theirs in search results. And as expensive as these sites were, ought they not “do something” too? Companies increasingly wanted exceptional websites requiring a sophisticated combination of talents to pull off. HTML and FTP skills, as useful as they had been, were no longer sharp enough tools in the Webmaster’s toolbox. Technologies such as CSS and multi-tier web application development rapidly made WYSIWYG editors useless for all but ordinary websites. And with the explosion of competition and possibilities on the Internet, few businesses were willing to pay for “ordinary”.

In 1995, the “professional web design firm” was a single, talented person working from home. Today it’s a diverse team of back-end developers, front-end developers, graphic artists, UI designers, database and systems administrators, search engine marketing experts, analytics specialists, copywriters, editors, and project managers. The industry has simply grown so specialized, so quickly, that one person can hardly be a master of more than a single strand of the web.

Suspect #2: Is it the economy, stupid?


Then again, perhaps the disappearance of the Webmaster is better explained by an underwhelming economy than by overwhelming technology. Riding high on the bull market of the late ’90s, companies were increasingly willing to assume more risk to reach potential customers. This was especially true of small businesses, which traditionally have minuscule advertising and marketing budgets. Everyone wanted a piece of the Internet pie, and each turned to the Webmaster to deliver. More than a few Webmasters made a respectable living by cranking out a couple of $500 websites every week.

Once the bubble burst in early 2000, the dot-com hangover left many small businesses clutching their heads and checking their wallets. As companies braced themselves to merely maintain what they already had, the first cuts inevitably came to marketing and advertising. In-house Webmasters were summarily let go, their duties hastily transferred to an already overworked office manager. Freelance Webmasters were hit even harder as business owners struggled first to take care of their own. The gold rush had crumbled into fool’s gold even faster than it had started.

While a few Webmasters were able to weather the storm (mostly those with either extraordinary skills or a gainfully employed spouse), the majority were forced to abandon their budding profession and return to the world of the mundane.

Suspect #3: The rise of Web 2.0


Another strong possibility is that the Internet has simply evolved beyond the Webmaster. “Web 2.0” is the naked emperor of technological neologisms; we all nod our heads at the term but then stammer when pressed for a definition. As far as I can tell, Web 2.0 is mostly about rounded corners, low-contrast pastel colors, and domain names with missing vowels. But it also seems to be about an emphasis on social collaboration. This may seem like a no-brainer given the connectedness of the Internet itself; thinking back to Web 1.0, however, there was a distinct lack of this philosophy. Web 1.0 was more an arms race to build “mindshare” and “eyeballs” in order to make it to the top of the hill with the most venture capital. Even the Web 1.0 term “portal” conjures up an image of Lewis Carroll’s Alice tumbling down a hole and into an experience wholly managed by the resident experts: the Webmasters. Despite its power and its promise to be so much more, the web wasn’t much different from network television or print. Even the most interesting and successful business models of the Web 1.0 era could have been accomplished years earlier with an automated telephone system.

It wasn’t until after the failure of the initial experiment that people began to rethink the entire concept of the Internet. Was the Webmaster as gatekeeper really necessary? If we all have a story to share, why can’t everyone contribute to the collective experience? Perhaps it was the overabundance of Herman Miller chairs, but Web 1.0 was inarguably about style over substance. Yet, as anyone who has ever visited MySpace can attest, today content is king. With all of us simultaneously contributing and consuming on blogs, MySpace, YouTube, Flickr, Digg, and SecondLife, who needs a Webmaster anymore?