Critical Actions for the CIOs
by: Jerry Liao
Accepting the changes brought about by the development of information technology is not easy, and keeping up with them is definitely a challenge. And the one person in the corporate structure who will absorb most of the challenges is the one in charge of the entire corporate I.T. infrastructure – the Chief Information Officer (CIO).
The IT organization is in transition and is seeing its focus shift dramatically from technology to business processes and relationships. By 2010, 50 percent of IT organizations will refocus on brokering services and shaping business demand, rather than on delivering IT services directly. This is up from 5 percent in 2004, according to Gartner. This fundamental change in focus will drive new styles of IT organizations, new roles for IT service businesses and new functions for chief information officers (CIOs). CIOs must make decisions concerning the direction of their organizational change as soon as possible, because these changes will take at least two years to execute.
By 2012, at least 50 percent of large IT organizations will divide into two parts, one focused on technology sourcing and delivery, the other on architecture and change. “CIOs need to lead transformation and to adapt their own roles as they do so. CIOs who master leadership will blend business and technology capabilities in their teams and in themselves,” said John Mahoney, research vice-president and co-chair of the Transformation of IT – CIO Summit, which will happen on 11-12 September 2007, in Barcelona, Spain.
In order to facilitate this transformation, Mr Mahoney said that CIOs will have to take the following five critical actions in the next 18 months.
1- Choose the Main Value Focus of the Department (Previously) Known as IT and Rename It
When planning for the future focus of the IT organization, CIOs should not dwell on the existing IT organizational chart and boundaries but rather think in terms of the IT organization’s required capabilities, and consider how they can best be delivered in their company. “The management information system (MIS), information technology (IT), information system (IS) organization of today is no longer a monolithic organization but an organizational unit made up of complementary sets of roles. These roles hinge on creative collaboration between business and traditional IT professionals,” said Mr Mahoney.
2- Implement Policies to Reduce Complexity
Complexity is becoming the chief enemy of effective IT management. It inhibits cost management, transparency and adaptability. “Today, it is not only necessary to rationalise IT environments after the past two years of cost-cutting, but also to think critically about the root causes of technology proliferation and variation,” added Mr Mahoney.
CIOs must establish carefully crafted policies that will, over time, refine their organizations’ IT architecture, rationalise IT suppliers and leverage information. They must ensure that their organizations’ business case processes examine conforming to policies before requesting project approval, and that their organizations’ architecture addresses effectiveness in the actual business context, not just technical elegance.
3- Decide When Your Default Source of IT Infrastructure Should Be External
IT-utility-style computing technologies and the maturing service provider market will drive many organizations to externally source infrastructure services that are not mission-critical, joining established services (such as LAN/WAN provisioning, help desk and other functions) that are typically outsourced today.
Although in-house data centres will continue to have an advantage in enterprises whose IT infrastructure is mission-critical, almost all IT organizations will use external providers for non-critical or transitory capacity. The outsourcing of most enterprise data centre infrastructure will grow from a few early adopters now to the mainstream majority by 2015. By 2015, more than 75 percent of IT infrastructure will be purchased as a usage-based service from external and internal providers.
“This will be a long, complex transformation, and advance planning is essential. IT leaders must decide by early 2008 when their default source for base-layer IT infrastructure should be external, and identify the critical architecture and management capabilities that must remain in-house,” said Mr Mahoney.
4- Decide Which Services, Metrics and Incentives Map to Business Outcomes
As businesses take advantage of partnerships to create agility and growth, the importance of managing by business outcomes increases. By 2010, at least 50 percent of new outsourcing deals will use measures based on business outcomes, not IT service levels.
When developing their sourcing strategy, user organizations need to select the style of outcome they intend in relation to the business value of the activity. Once this is decided they can then select appropriate service provider partners.
5- Identify and Start Building Competencies for Your Organization’s Future Value Focus
Mr Mahoney concluded by saying that clear guidance for the IT team and the organization is essential for coherence, effectiveness and safety. It also minimises the need for repeated management intervention to solve similar problems. He advised CIOs to focus IT organization competencies on the disciplines of leadership (people management, strategy, sourcing, and service management) and governance (architecture and infrastructure, security, asset management and process management).
The capability requirements for each discipline will vary depending on the role of IT in the strategy of the business as a whole. Either way, this focus will engender an environment where business, technology and business relationships can be successfully fused to deliver real business advantage.
The challenges will continue, and everyone within the business ecosystem should adapt and innovate to maintain their position in the workplace.
The Skype Network Outage
by: Jerry Liao
The local I.T. industry was surprised by the sudden replacement of Antonio Javier as Managing Director of Microsoft Philippines, which is why not much attention was given to the Skype outage – a problem that left most Skype users around the world in the dark. Millions of users were unable to make phone calls or send instant messages via the popular Internet-based service.
What exactly happened? Skype had this to say:
“On Thursday, 16th August 2007, the Skype peer-to-peer network became unstable and suffered a critical disruption. The disruption was triggered by a massive restart of our users’ computers across the globe within a very short timeframe as they re-booted after receiving a routine set of patches through Windows Update. The high number of restarts affected Skype’s network resources. This caused a flood of log-in requests, which, combined with the lack of peer-to-peer network resources, prompted a chain reaction that had a critical impact.
Normally Skype’s peer-to-peer network has an inbuilt ability to self-heal, however, this event revealed a previously unseen software bug within the network resource allocation algorithm which prevented the self-healing function from working quickly. Regrettably, as a result of this disruption, Skype was unavailable to the majority of its users for approximately two days.
The issue has now been identified explicitly within Skype. We can confirm categorically that no malicious activities were involved and that our users’ security was not, at any point, at risk. This disruption was unprecedented in terms of its impact and scope. We would like to point out that very few technologies or communications networks today are guaranteed to operate without interruptions.
We are very proud that over the four years of its operation, Skype has provided a technically resilient communications tool to millions of people worldwide. Skype has now identified and already introduced a number of improvements to its software to ensure that our users will not be similarly affected in the unlikely possibility of this combination of events recurring.
The Skype community of users has been incredibly supportive and we are very grateful for all their good wishes.”
Initially, the breakdown was feared to be the work of hackers. Later, it was said to be the result of a software bug, triggered by a massive restart among users who had downloaded a routine Windows patch from Microsoft. To avoid further misunderstanding, Skype answered some questions to clarify what really happened:
1. Are we blaming Microsoft for what happened?
We (Skype) don’t blame anyone but ourselves. The Microsoft Update patches were merely a catalyst — a trigger — for a series of events that led to the disruption of Skype, not the root cause of it. And Microsoft has been very helpful and supportive throughout. The high number of post-update reboots affected Skype’s network resources. This caused a flood of log-in requests, which, combined with the lack of peer-to-peer network resources at the time, prompted a chain reaction that had a critical impact. The self-healing mechanisms of the P2P network upon which Skype’s software runs have worked well in the past. Simply put, every single time Skype has needed to recover from reboots that naturally accompany a routine Windows Update, there hasn’t been a problem.
2. What was different about this set of Microsoft update patches?
In short – there was nothing different about this set of Microsoft patches. During a joint call soon after problems were detected, Skype and Microsoft engineers went through the list of patches that had been pushed out. We ruled each one out as a possible cause for Skype’s problems. We also walked through the standard Windows Update process to understand it better and to ensure that nothing in the process had changed from the past (and nothing had). The Microsoft team was fantastic to work with, and after going through the potential causes, it appeared clearer than ever to us that our software’s P2P network management algorithm was not tuned to take into account a combination of high load and supernode rebooting.
3. How come previous Microsoft update patches didn’t cause disruption?
That’s because the update patches were not the cause of the disruption. In previous instances where a large number of supernodes in the P2P network were rebooted, other factors of a “perfect storm” had not been present. That is, there had not been such a combination of high usage load during supernode rebooting. As a result, P2P network resources were allocated efficiently and self-healing worked fast enough to overcome the challenge.
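The dynamic Skype describes – a flood of re-login requests hitting a network whose login capacity is itself supplied by the rebooting peers – can be sketched as a toy feedback loop. Everything here (peer counts, the supernode fraction, per-supernode login capacity) is invented for illustration and is not Skype’s actual architecture:

```python
def recover(peers, restart_fraction, logins_per_supernode=50,
            supernode_fraction=0.01, steps_cap=10_000):
    """Toy model of the 'perfect storm': a mass restart both creates a
    backlog of login requests and removes the supernodes that would
    have absorbed it. Returns the number of steps until the backlog
    clears. All parameters are made-up illustrative values."""
    online = int(peers * (1 - restart_fraction))
    backlog = peers - online
    steps = 0
    while backlog > 0 and steps < steps_cap:
        # Login capacity this step is proportional to peers still online.
        capacity = max(1, int(online * supernode_fraction * logins_per_supernode))
        served = min(backlog, capacity)
        online += served      # each successful re-login also restores capacity
        backlog -= served
        steps += 1
    return steps

print(recover(1_000_000, 0.05))  # routine patch-day trickle of reboots
print(recover(1_000_000, 0.90))  # near-simultaneous mass restart
```

With a routine 5 percent of peers rebooting, the backlog clears almost at once; knock out 90 percent at the same time and capacity collapses along with the peers, so recovery takes several times as long – self-healing still works, just not quickly, which is essentially the failure mode the company describes.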
The impact of the breakdown proves two things: one, the technology is widely accepted and widely used – it is no longer an alternative; and two, the marketplace expects the same level of responsibility and accountability that it demands from a public utility. Users now rely on Skype in the same way they expect to rely on their phone systems.
Today, the Internet is no longer just a source of information; it is a means of communication. VoIP services, email, newsgroups, IM and social networking sites are now all part of our lives, and this cannot be denied. Service providers should realize this and do their best to ensure the availability of these services.
New computer chip cooling method created
by: Jerry Liao
While most of us computer users are more concerned about speed and storage capacity, computer and chip manufacturers are more concerned with another aspect of this technology – HEAT. In computers and electronics, power equals heat, and finding ways to manage the heat generated in more powerful laptops and handheld computers is always a challenge.
Researchers have demonstrated a new technology using tiny “ionic wind engines” that might dramatically improve computer chip cooling, possibly addressing a looming threat to future advances in computers and electronics. The Purdue University researchers, in work funded by Intel Corp., have shown that the technology increased the “heat-transfer coefficient,” which describes the cooling rate, by as much as 250 percent.
“Other experimental cooling-enhancement approaches might give you a 40 percent or a 50 percent improvement,” said Suresh Garimella, a professor of mechanical engineering at Purdue. “A 250 percent improvement is quite unusual.” When used in combination with a conventional fan, the experimental device enhanced the fan’s effectiveness by increasing airflow to the surface of a mock computer chip. The new technology could help engineers design thinner laptop computers that run cooler than today’s machines.
The new cooling technology could be introduced in computers within three years if researchers are able to miniaturize it and make the system rugged enough, Garimella said. As the technology is further developed, such cooling devices might be integrated into portable consumer electronics products, including cell phones. Advanced cooling technologies are needed to help industry meet the conflicting goals of developing more compact and lightweight computers that are still powerful enough to run high-intensity programs for video games and other graphics-laden applications.
The experimental cooling device, which was fabricated on top of a mock computer chip, works by generating ions – or electrically charged atoms – using electrodes placed near one another. The device contained a positively charged wire, or anode, and negatively charged electrodes, called cathodes. The anode was positioned about 10 millimeters above the cathodes. When voltage was passed through the device, the negatively charged electrodes discharged electrons toward the positively charged anode. Along the way, the electrons collided with air molecules, producing positively charged ions, which were then attracted back toward the negatively charged electrodes, creating an “ionic wind.” This breeze increased the airflow on the surface of the experimental chip.
Conventional cooling technologies are limited by a principle called the “no-slip” effect – as air flows over an object, the air molecules nearest the surface remain stationary. The molecules farther away from the surface move progressively faster. This phenomenon hinders computer cooling because it restricts airflow where it is most needed, directly on the chip’s hot surface.
The new approach potentially solves this problem by using the ionic wind effect in combination with a conventional fan to create airflow immediately adjacent to the chip’s surface, the researchers said. The device was created at Purdue’s Birck Nanotechnology Center in the university’s Discovery Park. The researchers quantified the cooling effect with infrared imaging, which showed the technology reduced heating from about 60 degrees Celsius – or 140 degrees Fahrenheit – to about 35 degrees C, or 95 F.
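The reported numbers hang together under Newton’s law of cooling, Q = h·A·(T_surface − T_ambient). Assuming fixed chip power and area (an assumption of this sketch, as are the 10 W, 1 cm² and 25 °C ambient figures, which are not from the study), multiplying the heat-transfer coefficient h by 3.5 – a 250 percent improvement – divides the surface-to-ambient temperature rise by 3.5, which is exactly the reported drop from about 60 °C to about 35 °C:

```python
def surface_temp(power_w, h, area_m2, t_ambient_c):
    """Steady-state surface temperature from Newton's law of cooling:
    Q = h * A * (T_surface - T_ambient)  =>  T_surface = T_amb + Q/(h*A)."""
    return t_ambient_c + power_w / (h * area_m2)

# Illustrative numbers (assumptions, not from the study): a 10 W mock chip
# of 1 cm^2 in 25 C air, with the baseline coefficient chosen so the
# un-enhanced chip sits at the reported ~60 C.
power, area, ambient = 10.0, 1e-4, 25.0
h_base = power / (area * (60.0 - ambient))   # ~2857 W/(m^2*K)
h_ionic = h_base * 3.5                        # the 250 percent improvement

print(round(surface_temp(power, h_base, area, ambient), 1))   # 60.0
print(round(surface_temp(power, h_ionic, area, ambient), 1))  # 35.0
```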
The researchers also have developed computational models to track the flow of electrons and ions generated by the device, information needed for designing future systems using the technology. Computer chips are constantly being upgraded by creating designs with more densely packed circuits, transistors and other electronic components. The number of transistors per chip has been doubling every 18 months or so, in line with a general principle called Moore’s law. As performance increases, however, so does heat generation, particularly in small hot spots. These hot spots not only hinder performance, but also could damage or destroy delicate circuitry. This means new cooling methods will be required for more powerful computers in the future.
The next step in the research will be to reduce the size of components within the device from the scale of millimeters to microns, or millionths of a meter. Miniaturizing the technology will be critical to applying the method to computers and consumer electronics, allowing the device to operate at lower voltage and to cool small hot spots, Garimella said.
Another challenge will be making the technology rugged enough for commercial applications. “As things get smaller, they get more delicate, so we need to strengthen all the elements. And we believe we can achieve this goal in a year or so,” Garimella said.
BSA’s Follow SAM! Campaign
by: Jerry Liao
Remember the 2001 Sean Penn movie “I Am Sam”, where he played a mentally challenged man who fought for custody of his 7-year-old daughter (Dakota Fanning), and in the process taught his lawyer (Michelle Pfeiffer) the value of love and family? My article has nothing to do with his condition, but with the fact that Sam supposedly cannot care for his daughter because of it – and yet he actually can. My story is the opposite: an organization that pretends to care but has another agenda up its sleeve.
A couple of weeks back, I attended a Business Software Alliance (BSA) conference. BSA is a trade group representing a number of the world’s largest software makers with a task to stop copyright infringement of software produced by its members, among others. Simply put, if you or your company are using illegal software of its members, then BSA can come after you.
The BSA event was about their Follow SAM! campaign, an educational campaign which aims to encourage the adoption of proper Software Asset Management (SAM) among companies. The Follow SAM! campaign is a call for companies to adopt a model software asset management practice in their organization in order to fully maximize their IT resources and avoid the use of unlicensed software. Companies that want to join the campaign must first submit a software audit report and a declaration letter on SAM adoption. The reports and declaration letters will then be evaluated and verified by BSA members, consisting of the world’s leading software manufacturers.
Software asset management or sometimes called Software Distribution Management is the practice of integrating people, processes and technology to allow software licenses and usage to be systematically tracked, evaluated and managed. The goal of SAM is to reduce IT expenditures, human resource overhead and risks inherent in owning and managing software assets. SAM includes maintaining software license compliance; tracking the inventory and usage of software assets; and maintaining standard policies and procedures surrounding the installation, deployment, configuration, and use of software assets. SAM represents the software component of IT asset management, which also includes hardware asset management.
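At its simplest, the tracking half of SAM boils down to reconciling licenses owned against installations found. A minimal sketch, with hypothetical product names and fields chosen purely for illustration:

```python
from dataclasses import dataclass

@dataclass
class SoftwareAsset:
    """One line of a SAM inventory (hypothetical fields for illustration)."""
    name: str
    licenses_owned: int
    installs_found: int

def compliance_report(assets):
    """Flag titles where installs exceed owned licenses (a discrepancy
    to rectify) and titles with unused licenses (a cost to recover)."""
    over = [a.name for a in assets if a.installs_found > a.licenses_owned]
    under = [a.name for a in assets if a.installs_found < a.licenses_owned]
    return over, under

inventory = [
    SoftwareAsset("Office Suite", licenses_owned=100, installs_found=120),
    SoftwareAsset("CAD Tool", licenses_owned=25, installs_found=10),
]
over, under = compliance_report(inventory)
print(over)   # over-deployed titles: a licensing gap to close
print(under)  # shelfware: licenses paid for but sitting unused
```

Note that the practice cuts both ways: the same audit that exposes under-licensing also exposes money wasted on shelfware, which is the cost-saving argument for SAM independent of any enforcement angle.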
During the event, BSA officials were on hand to answer questions, and my question was: “Is this BSA’s effort to determine which local companies are using licensed or unlicensed software?” Of course, the answer was NO. BSA officials were quick to say that the Follow SAM! campaign has many objectives, and encouraging companies to use licensed software is one of them, but not the main purpose.
I asked again if the Follow SAM! campaign can be applied to companies using open source applications; BSA officials said YES. I also asked if Follow SAM! can be applied to companies using unlicensed software; BSA officials said NO – what is there to manage if no software investment was made in the first place? True. But at the same time, I thought using licensed software is just part and parcel of the entire Follow SAM! campaign. And the first requirement, wherein companies must submit a software audit report, automatically eliminates unlicensed software users.
Philippine Software Industry Association (PSIA) president Fermin “Tarcs” Taruc was also at the event and I suggested to him that perhaps Follow SAM! should be spearheaded by his group rather than by BSA. Why? Because companies will have doubts about the sincerity of BSA in helping them implement SAM. First and foremost, the goal of BSA is to ask or even force companies to use licensed software and that cannot be changed – whether you do it via education campaign or whatever, at the end of the day, companies should use licensed software. And because of this, companies might even be scared to work with BSA and throw away the Follow SAM! campaign altogether.
To prove my point, I visited the BSA website. Among the benefits a company will get from joining the Follow SAM! campaign: companies with verified software audits will receive a program certificate from BSA and will be awarded one-year immunity from BSA-initiated enforcement actions. Another benefit is that companies will be given 14 days to rectify licensing discrepancies, while non-certified companies face the risk of action without warning at any time. Both benefits concern licensing.
I am all for software asset management, because I know how important software is to every business, and managing it and treating it as a company asset is as important as managing any other company resource. Poor software management can cost a company in efficiency, productivity and money. But SAM should not be about enforcement; it should be about education and the realization of its importance. Companies should be told the advantages of having licensed software rather than being forced to use it because it is the law.
Let me just say that with all due respect, I don’t think BSA is the right agency to implement this. It may have been successfully implemented in other countries, but I doubt if it can be duplicated here in the Philippines. I hope I am wrong.
By the way, the campaign is being endorsed by the Department of Trade and Industry (DTI), the Commission on Information and Communications Technology (CICT), the Bureau of Internal Revenue (BIR) and the Intellectual Property Office of the Philippines (IP Philippines).
Microsoft regional officers made a surprising move by appointing Mr. Rafael “Pepeng” Rollan as the new Managing Director of Microsoft Philippines, replacing Mr. Antonio “TJ” Javier. No details were given as to the reason behind the sudden changing of the guard. Mr. Javier will continue to head Microsoft Philippines until September 30, 2007.
In a telephone conversation prior to the announcement, Mr. Javier said he has been serving as Microsoft Philippines managing director for five years, when the tenure is normally three years. Mr. Javier also said that Microsoft Philippines ended its fiscal year with 56% growth, one of the best if not the best performances of a Microsoft subsidiary in Asia.
Prior to his appointment, Mr. Rollan served as Microsoft’s Enterprise Partner Group Director. He will serve as Managing Director and Enterprise Partner Group Director concurrently starting October 1, 2007.
New Global Internet Channel Launched to Find Missing Children
by: Jerry Liao
I am sure most of you who have access to the Internet have visited YouTube or some other video sharing website. You have watched your favorite music videos, television shows, movie trailers, personal videos and a lot more – most if not all classified as entertainment videos. But is that all the video we can upload to video sharing websites? Can we use these free websites to help do some social work, like looking for missing children? The short answer: Yes – and it’s here and available now.
The International Centre for Missing & Exploited Children (ICMEC), in partnership with Google’s YouTube, and The Find Madeleine Campaign today announced the creation of a new initiative that will provide worldwide exposure to information and videos of missing children. A new YouTube Missing Children’s Channel has been created exclusively for posting videos of missing children. The new channel can be found at http://www.youtube.com/DontYouForgetAboutMe.
Case information and videos of missing children will first be submitted to ICMEC for review and verification before posting on the new channel. ICMEC will work with analysts at the National Center for Missing & Exploited Children (NCMEC), local and national law enforcement on U.S. cases, and with Interpol on international cases to confirm it is an open case and verify the details of the case and video. After the case information has been certified, it will be forwarded to YouTube for posting. Anyone with information about a missing child featured on the website will be directed to contact the appropriate law enforcement agency.
“Every year hundreds of thousands of children go missing around the world and some are abducted to other countries, creating unique challenges for law enforcement and family members searching for them,” said Ernie Allen, President and CEO of ICMEC. “In the U.S. alone, nearly 800,000 children are missing each year or about 2,000 each day. Photos remain the single most effective tool for finding a missing child. This new resource will provide unprecedented exposure for missing children, reaching potentially millions of viewers every day and increasing the opportunity that someone has seen them.”
The timing of the announcement coincides with the 100th day since Madeleine McCann went missing. Madeleine disappeared on May 3, 2007 while on a family vacation in Portugal. This past June, Madeleine’s parents, Gerry and Kate McCann, sought ICMEC’s assistance to create an international resource that would quickly disseminate pictures of missing children throughout the world. Gerry McCann recently visited the headquarters of NCMEC and ICMEC in Virginia where he and Allen discussed the need for disseminating information and images of missing children on a broader, global basis.
“Kate and I are really enthusiastic about this powerful new resource,” said Gerry McCann, Madeleine’s father. “We believe it will help in the search for Madeleine and many other children. We are grateful to ICMEC for its leadership on behalf of our child and so many others.”
YouTube, which is owned by Google, is a popular video sharing website and leader in online video. Its popularity and global reach made YouTube a natural choice as a partner in this project. The channel’s headline banner, “Don’t You Forget About Me,” is named after the hit song by the Scottish rock group “Simple Minds.”
In addition to information and videos of missing children, the channel will include child safety and educational materials in several languages as well as Public Service Announcements and messages from dignitaries, celebrities and others including First Lady Laura Bush and soccer star David Beckham.
There will be no cost to post a video of a missing child on the new channel. Instructions and criteria for submitting a video can be found on the channel site or http://www.icmec.org. No incomplete or anonymous submissions will be accepted.
The International Centre for Missing & Exploited Children is a private, nonprofit, nongovernmental organization. It is the leading agency working on a global basis to combat child abduction and exploitation. It is the sister organization of the National Center for Missing & Exploited Children.
This project can also be duplicated here in the Philippines. Perhaps the Philippine government can join or create a website that will do the same. If videos are not available, then photographs will do. This is just one example; we can also set up a site where people can report traffic violators, crimes, anomalies and more – anything that can help the government do its job. Let technology work for us, and not the other way around.
Apple debuts new iMac computers
by: Jerry Liao
Apple ignited the personal computer revolution in the 1970s with the Apple II and reinvented the personal computer in the 1980s with the Macintosh. Today, Apple continues to lead the industry in innovation with its award-winning computers, OS X operating system and iLife and professional applications. Apple is also spearheading the digital media revolution with its iPod portable music and video players and iTunes online store, and has entered the mobile phone market this year with its revolutionary iPhone.
Apple recently unveiled an all new all-in-one iMac line featuring gorgeous 20- and 24-inch widescreen displays encased in elegant and professional aluminum and glass enclosures. The entire new iMac line features the latest Intel Core 2 Duo processors and a new, ultra-thin aluminum Apple Keyboard, built-in iSight® video camera for video conferencing and iLife® ’08, making it the ultimate digital lifestyle desktop computer for both consumers and professionals. The 20-inch iMac now starts at just $1,199 (P 57,552.00), $300 (P 14,400.00) less than the previous 20-inch model, and the 24-inch iMac starts at just $1,799 (P 86,352.00), $200 (P 9,600.00) less than the previous 24-inch model.
“This new iMac is the most incredible desktop computer we’ve ever made,” said Steve Jobs, Apple’s CEO. “Our new design features the innovative use of materials, including professional-grade aluminum and glass, that are highly recyclable.”
Redefining Apple’s signature all-in-one design, the new iMac integrates the entire computer system into a sleek, professional aluminum enclosure for a striking, clutter-free desktop. An elegant glass cover joins precisely to the aluminum enclosure creating a virtually seamless front surface. The new iMac’s 20- and 24-inch glossy widescreen displays provide incredibly crisp images, ideal for photos and movies using the all new iLife ’08 suite of digital lifestyle applications that are included. The new ultra-thin aluminum Apple Keyboard is just 0.33 inches thin at its front edge. A new optional Apple Wireless Keyboard is a compact design that, with Apple’s wireless Mighty Mouse, offers a cable-free desktop.
Packing professional performance into the convenience of an all-in-one design, the new iMac includes the latest Intel Core 2 Duo processors running up to 2.8 GHz with 4MB of shared L2 cache and up to 4GB of 667 MHz DDR2 SDRAM memory. The iMac line features ATI’s next generation of graphics with the ATI Radeon HD 2600 PRO with 256MB of GDDR3 memory and the ATI Radeon HD 2400 XT with 128MB of GDDR3 memory. The new iMac now offers up to 1TB of internal storage to accommodate a user’s growing library of digital photos, movies and music.
Providing the latest in high-performance connectivity options to quickly and conveniently transfer digital photos, music and video, the iMac includes built-in AirPort Extreme® 802.11n Wi-Fi networking, delivering up to five times the performance and twice the range of 802.11g;* Gigabit Ethernet; a total of five USB 2.0 ports (including two on the new Apple Keyboard); and one FireWire® 400 and one FireWire 800 port.
The new iMac, with its stunning design, features highly recyclable and durable materials including scratch-resistant glass and professional grade aluminum. The power-efficient iMac also meets the stringent new Energy Star 4.0 requirements.
Every iMac also includes iLife ’08, the most significant update ever to Apple’s award-winning suite of digital lifestyle applications, featuring a major new version of iPhoto® and a completely reinvented iMovie®, both seamlessly integrated with the new .Mac Web Gallery for online photo and video sharing. The new iMac also comes with the world’s most advanced operating system, Mac OS® X version 10.4.10 Tiger, including Safari™, Mail, iCal®, iChat AV, Front Row and Photo Booth.
Build-to-order options and accessories include: a 2.8 GHz Intel Core 2 Extreme processor, up to 4GB DDR2 SDRAM and up to a 1TB Serial ATA hard drive on the 24-inch iMac; up to 4GB DDR2 SDRAM and up to 750GB Serial ATA hard drive on the 2.4 GHz 20-inch iMac; and up to 4GB of DDR2 SDRAM and up to 500GB Serial ATA hard drive on the 2.0 GHz 20-inch iMac. Additional options include: new Apple Wireless Keyboard and wireless Mighty Mouse; AirPort Express® and AirPort Extreme Base Station (now with Gigabit Ethernet); the AppleCare Protection Plan; and pre-installed copies of iWork ’08, Logic Express 7, Final Cut Express HD 3.5 and Aperture 1.5.
Secret to Toddler Vocabulary Explosion Revealed
by: Jerry Liao
“Goo-goo ga-ga” – just some of the sounds uttered by our babies when they’re still young. Whatever they mean, they surely bring a smile to every parent. Parents even mimic this so-called “baby talk” just to communicate with their little angels. And how about the hidden competition over whether the baby will first utter the word “mama” or “papa”?
But when a baby reaches the age of 1 or so, his or her ability to talk suddenly doubles or triples. They will no longer say “mum-mum” when they’re thirsty. They will say “tubig (water)” or “gatas (milk)”. Research has now been conducted to determine the reasons behind this, and here’s the explanation:
Researchers have long known that at about 18 months children experience a vocabulary explosion, suddenly learning words at a much faster rate. They have theorized that complex mechanisms are behind the phenomenon. But new research by a University of Iowa professor suggests far simpler mechanisms may be at play: word repetition, variations in the difficulty of words and the fact that children are learning multiple words at once.
“The field of developmental psychology and language development has always assumed that something happens at that point to account for this word spurt: kids discover things have names, they switch to using more efficient mechanisms and they use their first words to help discover new ones,” said Bob McMurray, assistant professor of psychology in the UI College of Liberal Arts and Sciences. “Many such mechanisms have been proposed.”
McMurray writes that children may still engage those specialized mechanisms. But a series of computational simulations that he conducted suggest that simpler explanations – such as the repetition of words over time, the fact that children learn many words at the same time and the fact that words vary in difficulty – are sufficient to account for the vocabulary explosion.
“Children are going to get that word spurt guaranteed, mathematically, as long as a couple of conditions hold,” McMurray said. “They have to be learning more than one word at a time, and they must be learning a greater number of difficult or moderate words than easy words. Using computer simulations and mathematical analysis, I found that if those two conditions are true, you always get a vocabulary explosion.”
McMurray’s simulations are analogous to a series of jars of different sizes, each representing a word, with more difficult words represented by larger jars. As each unit of time passes, a chip is dropped into every jar. Once a jar is filled, the word is learned.
McMurray’s mathematical analysis suggests that the word spurt is largely driven by the number of small jars (easy words) relative to large jars (difficult words). As long as there are more difficult words than easy ones, the vocabulary explosion is guaranteed.
Few words in any language are used an overwhelming number of times in ordinary speech. So, if frequency of use is considered as a measure of degree of difficulty, languages have many more difficult than easy words, McMurray said.
Experts have long thought that once a child learns a word, it is easier for him or her to learn more words. Or in the case of McMurray’s simulation, the jars become smaller. But McMurray also simulated a model in which the jars became larger once a word was learned and found that the vocabulary explosion still occurred.
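The jar model lends itself to a few lines of simulation. This sketch uses invented parameters – a Gaussian spread of difficulties and one chip per jar per time step – rather than the ones from McMurray’s paper, but it reproduces the spurt: because moderate and difficult words outnumber easy ones, the learning rate starts slow and then surges, with no special mechanism switched on at any point:

```python
import random

def simulate_vocab(n_words=5000, mean_difficulty=100.0, sd=30.0, seed=42):
    """Illustrative sketch of McMurray's "jars" model (parameters are
    assumptions, not taken from the paper). Each word is a jar whose
    size equals its difficulty; every time step drops one chip into
    every jar in parallel, and a word is learned when its jar fills."""
    rng = random.Random(seed)
    # Gaussian difficulties: easy (small) words are rare, moderate ones common.
    sizes = [max(1, round(rng.gauss(mean_difficulty, sd))) for _ in range(n_words)]
    # With one chip per step, a jar of size s fills exactly at step s,
    # so the learning rate at step t is simply the number of jars of size t.
    rate = [0] * (max(sizes) + 1)
    for s in sizes:
        rate[s] += 1
    return rate

rate = simulate_vocab()
early = sum(rate[1:51])     # words learned during steps 1-50
spurt = sum(rate[51:151])   # words learned during steps 51-150
print(early, spurt)
```

Run it and the later window dwarfs the earlier one by an order of magnitude – a vocabulary explosion emerging from nothing but the shape of the difficulty distribution and parallel learning, which is exactly McMurray’s point.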
“If we see the same word spurt when we model the inverse of accepted thinking, then clearly the specialized mechanisms aren’t necessary,” he said. “Our general abilities can take us a lot farther than we thought.”