
Tuesday, October 27, 2009

The TV You Want Today

When you stroll into your local store to shop for a new TV, dozens of big, glossy screens will greet you, each one trying to draw you in with its bright, colorful pictures. And a bewildering bevy of new features promise a multitude of benefits. Which ones will make a difference in what you watch and in how it looks on the screen? We'll help you sort out what's important. (If you haven't bought an HDTV before, see our previous article package on setting up an HDTV.)

The Changing World of Television

It's official: The (television) world is flat. The market has just about completed the transition from the large, heavy, cube-shaped, standard-definition CRT (picture-tube) television set to the sleek, thin, light, high-definition flat-panel set. According to market research firm DisplaySearch, worldwide shipments of flat-panel televisions shifted from about 5 percent of all sets in early 2004 to nearly 75 percent of the total last spring. In terms of revenue, flat panels now account for more than 90 percent of the worldwide television market. The Consumer Electronics Association says that 52 percent of U.S. households have an HDTV today. And now that the digital transition is complete, HDTV adoption continues apace.

For many shoppers, this year's television purchase may bump a previously purchased HDTV down to some other area of the house, such as the kitchen or a bedroom. But whether this is your first HDTV set or your third, it pays to get a model that's packed with all of the latest features. You'll likely find some eye-popping HDTV deals this holiday season, but don't expect prices to plummet, even if HDTV prices today are 20 percent lower than they were last year. According to DisplaySearch analyst Paul Semenza, the LCD panels used in HDTVs are actually getting more expensive. So far, prices of models on store shelves haven't reflected this shift--but it could inhibit the deep discounts that often appear during a holiday buying season.

Another recent trend is full-resolution HDTVs: All but a few entry-level, low-cost models with screen sizes greater than 40 inches have the 1920-by-1080 resolution of "full HD" (1080p). At smaller screen sizes, 720p remains common: On Best Buy's Web site, I found that 18 of the 35 sets between 30 and 39 inches were 720p, including models from LG, Panasonic, Samsung, and Sony.

40" Samsung LN40B650T1f: It's the priciest 40-inch model on our Top 5 chart of 40-to-42-inch HDTVs, but this television offers first-rate picture quality, along with a slew of ports for Internet features and other network connections, all in a very usable package.
Beyond entry-level sets, today's HDTVs differentiate themselves with features that can enhance your viewing experience and improve the TV's performance. Several capabilities--such as fast-motion response times and LED backlighting--that were once exclusive to super-pricey high-end models are now showing up in more-affordable mainstream units. But what do these new features mean, and will they make a noticeable difference in your viewing? Which features are merely nice to have, and which ones are worth paying extra money to get?

(Note: For this feature overview and our latest roundup of HDTVs, the PC World Labs developed a new, up-to-date suite of tests, described in "How We Test HDTVs.")

Rising Refresh Rates

According to DisplaySearch, about half of all LCD HDTVs with 40-inch or larger screens now have refresh rates of 120Hz or higher. It took a couple of years for 120Hz to reach the mainstream, but today only entry-level and economy models at these large sizes have the standard 60Hz refresh rate. The picture changes for sets under 40 inches, though: DisplaySearch says that among all such LCDs shipped in the second quarter of 2009, only 14 percent were capable of 120Hz. The company expects that figure to grow to 24 percent by the year 2013.

Some manufacturers have made a full-on push to 120Hz. Sony, for example, has only one series--the Bravia S5100--that doesn't have 120Hz or 240Hz models.

Note the emphasis here on LCDs (versus plasma screens): Since LCDs have the lion's share of the flat-panel market at more than 90 percent, it makes sense that they get most of the attention. But LCD technology has a known issue with fast motion, stemming from its reliance on reorienting tiny liquid crystal molecules to block or to transmit light from the panel's backlight. These molecules need time to move from one position to another, so traditional panel designs ran into a problem with motion blur: Commonly, the leading and trailing edges of a fast-moving object in an image looked soft, an unwelcome artifact--and not just for hockey fans trying to follow a speeding puck on the screen.

Refresh Rates, Continued

Panel manufacturers found that changing the cell structure and the formulations of liquid crystal material wasn't enough to overcome this problem: The trick was to refresh the image twice as often, doubling the refresh rate from 60 to 120 times a second. In addition, manufacturers improved their televisions' controlling circuitry so that it would look at the two original frames in the 60Hz image stream, and interpolate a new frame to provide an intermediate image.

This approach produced a marked improvement over traditional 60Hz sets, one that's well worth the extra investment in a set with a minimum of 120Hz. The price difference has narrowed, but you can still expect to pay approximately $100 to $200 more to step up to an HDTV set with this feature in the 40- to 42-inch range. It's a must-have feature if you plan to watch sports, but any content that includes panning scenes and fast action will benefit from this technology.

If 120Hz is good, then 240Hz must be twice as good, right? The answer is a lot murkier than that. The manufacturers that offer 240Hz refresh technology are divided into two camps, each with a different approach. Samsung and Sony use a true 240Hz technology, in which (as in 120Hz sets) the controller starts with a pair of frames from the 60Hz content stream--but then creates three additional intermediate frames, not one. This means that for each of the original frames, the set actually displays four frames. (The math changes for 24p signals, such as those piped out by Blu-ray Disc players, but the concept is similar.)

Adding these extra frames causes the liquid crystal material to move more quickly than it otherwise would, which in turn reduces the blur effects. The difference may be noticeable compared with 120Hz, but in our tests it wasn't as dramatic as the difference between 60Hz and 120Hz, even when we looked at the sets side by side. As such, 240Hz is probably not worth paying a lot more for over the cost of a 120Hz model. (Right now, the jump from 120Hz to 240Hz is about $300 to $600, a large premium compared with the step from 60Hz to 120Hz.)

LG 42LH50
LG approaches 240Hz by a different path: Its models with 240Hz performance generate one interpolated frame for each of the standard 60Hz frames, just as 120Hz models do, but they flash their backlights twice for every frame. Thus, 60 original frames plus 60 more interpolated frames make 120 frames, and then the backlight flashing twice for each frame yields 240 flashes per second. Like a strobe light in a disco dance hall, the flash of the backlight helps freeze the action on the screen and reduce motion blur. But this eye trick still presents only 120 frames per second, so asserting that its refresh rate is faster than 120Hz rests on rather shaky science.
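The arithmetic behind the two "240Hz" approaches can be sketched as follows (a simplified model; real controllers also handle 24p sources and scene cuts differently):

```python
# Simplified model of the two "240Hz" approaches described above.
# Samsung/Sony: interpolate 3 new frames between each pair of 60Hz frames.
# LG: interpolate 1 frame (120 frames/sec), then flash the backlight
#     twice per frame.

SOURCE_FPS = 60

# True 240Hz interpolation: each source frame yields 4 displayed frames.
true_240_frames = SOURCE_FPS * 4            # 240 unique frames per second

# LG's approach: 120 unique frames, each shown with two backlight flashes.
lg_frames = SOURCE_FPS * 2                  # 120 unique frames per second
lg_flashes = lg_frames * 2                  # 240 backlight flashes per second

print(f"True 240Hz: {true_240_frames} distinct frames/sec")
print(f"Scanning backlight: {lg_frames} distinct frames/sec, "
      f"{lg_flashes} flashes/sec")
```

Both schemes hit 240 events per second, but only the first produces 240 distinct images, which is the root of the "shaky science" complaint above.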

Panels with 120Hz (or faster) refresh rates have one additional benefit: Most television programming is recorded at 30 frames per second (fps), which is easy to double for the 60Hz refresh rate that most HDTVs have used. Movies, however, are filmed at 24 fps, which poses problems for technicians seeking to digitize them for DVD or broadcast formats. To fit the 30-fps timing, every four frames of movie film must be stretched to fit five frames of video. The process employed to achieve this, called "3:2 pulldown," uses two interlaced fields of the first film frame and then three interlaced fields of the next frame to produce the stretch.

This awkward conversion can create a motion artifact called "judder," a jerkiness or slight stutter visible in the finished image. But since 120Hz, unlike 60Hz, is an even multiple of 24, these panels can display 24-fps material without requiring any conversion; each frame just gets shown five times.
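The divisibility argument behind this can be sketched in a few lines (a simplification: real 3:2 pulldown operates on interlaced fields, as described above):

```python
# Why 24-fps film fits cleanly into a 120Hz refresh but not a 60Hz one.

def repeats_per_frame(refresh_hz, source_fps):
    """Return how many times each source frame is shown, or None if
    the refresh rate is not an even multiple of the source rate."""
    if refresh_hz % source_fps == 0:
        return refresh_hz // source_fps
    return None

print(repeats_per_frame(60, 24))    # None -> needs 3:2 pulldown (judder)
print(repeats_per_frame(120, 24))   # 5    -> each film frame shown 5 times
print(repeats_per_frame(120, 30))   # 4    -> video content also fits evenly
```

Because 120 is an even multiple of both 24 and 30, a 120Hz panel can show film and video sources without any frame-stretching conversion at all.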

Note also that plasma does not have this problem. The individual plasma pixels can actually turn on and off much faster than an LCD pixel; in fact, Panasonic has taken to describing its panels as "600Hz". Plasma needs this extra speed because a pixel can only be on or off, and so must be turned on and off rapidly during each frame of an image to create the correct shade of color. So in theory, motion blur is not a problem for plasma models. However, the PC World Labs' tests showed that, at least in the case of the two plasma models we tested from Panasonic--the TC-PS461 and the TC-P42X1--some plasma sets could learn a thing or two from LCDs.

LED Backlighting

Another feature growing in prominence is the use of LEDs as backlights for LCD TV panels. Both Samsung and Toshiba call their models with this feature "LED TVs," which has confused many consumers. LED TVs are not a new technology; they're just LCD TVs with a different type of backlight. LEDs have some distinct advantages over traditional cold-cathode fluorescent lamp (CCFL) designs, which many LCD TVs use. Compared with CCFL technology, LED backlighting results in TVs that require less power (by up to 40 percent for a 40-inch television) and provide improved color performance: LED TVs handle red and green hues better, resulting in more-natural, more-lifelike picture quality.

46" Samsung LN46B750U1F: The top-ranked model on our chart of larger high-definition sets, this 46-inch HDTV earned plaudits for superb picture quality, Internet connectivity, and an array of functions on its full-featured remote control.
Perhaps the foremost advantage of LED backlighting is its ability to enhance contrast and produce darker blacks. This capability closes the gap between LCD screens and plasma displays, which traditionally have offered deeper blacks than LCDs. Here again, manufacturers have adopted two different approaches to implementing LED backlights. One design puts the LED lights behind the LCD panel in a big matrix layout. This approach permits the use of "local dimming": If the controller recognizes that a portion of the image is generally dark, it can automatically dim the LEDs behind that one small segment of the image. This helps keep black levels low, increasing the apparent contrast.
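The local-dimming idea can be sketched in miniature: divide the image into backlight zones and set each zone's LED level from the brightest pixel it must display (the zone layout and dimming rule here are hypothetical, purely for illustration):

```python
# Toy local-dimming controller: one backlight level per zone, taken from
# the brightest pixel that zone has to show (luminance on a 0.0-1.0 scale).

def zone_backlight_levels(frame, zone_rows, zone_cols):
    """frame: 2D list of pixel luminances in [0.0, 1.0].
    Returns a zone_rows x zone_cols grid of backlight levels."""
    h, w = len(frame), len(frame[0])
    zh, zw = h // zone_rows, w // zone_cols
    levels = []
    for zr in range(zone_rows):
        row = []
        for zc in range(zone_cols):
            zone = [frame[y][x]
                    for y in range(zr * zh, (zr + 1) * zh)
                    for x in range(zc * zw, (zc + 1) * zw)]
            row.append(max(zone))   # dark zone -> dim LED -> deeper black
        levels.append(row)
    return levels

# A frame that is dark on the left half and bright on the right:
frame = [[0.05, 0.05, 0.9, 1.00],
         [0.02, 0.10, 0.8, 0.95]]
print(zone_backlight_levels(frame, 1, 2))   # [[0.1, 1.0]]
```

The dark zone's backlight drops to a tenth of full power while the bright zone stays at maximum, which is exactly how local dimming deepens blacks without dimming highlights elsewhere on the screen.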

The other way to use LEDs with LCD panels is to put them along the panel's edges, as Samsung's 1.2-inch-thick, 46-inch UN46B8000 does. This approach requires sophisticated diffusers to spread the light evenly behind the LCD layer, and it reduces or eliminates the ability to improve apparent contrast through localized, content-based dimming. It does keep the part count much lower, however, and it can make the heat that the LEDs generate easier to manage.

Though LEDs have their benefits, they come with their own issues. For one thing, LED TVs cost appreciably more than CCFL-based models--about $300 more, on average--due both to the cost of manufacturing the LEDs and to the cost of installing these arrays. Also, LED production processes cannot yet make units with consistent color output, so manufacturers must inspect each LED and "bin" it--grouping it with other LEDs of similar color. The more consistent and accurate the color output required, the more the individual LEDs cost. Until the industry solves this problem, LCD TVs with LED backlights are likely to cost more than CCFL sets.

If you care about color quality and are willing to tweak your television, an LED-backlight model may be worth the extra money. Even at a $300 premium, you would be paying less than $30 a year extra over the set's expected lifetime, or less than $2.50 a month. We've seen some lovely images produced by LED-backlit HDTVs--on the Samsung LN-A950, for example. Note, however, that none of the models we tested for this roundup included LED backlights.

Connectivity

Another big trend this year involves connections to bring the Internet to your TV. Many HDTVs have an ethernet connection on the back, plus integrated software for dealing with Web content. If you connect to the Internet via your home network's router, your TV can gain access to a range of Web-based content, all without going through a computer.

According to data from Nielsen, 90 percent of U.S. homes now have access to broadband Internet connections, so the connected TV is entirely feasible with today's technology, especially if the HDTV set has a Wi-Fi capability, as many do.

The TVs shipping today that have Web access limit the locations you can visit online. This approach simplifies navigation, which is important because you have to use a remote control instead of a keyboard, mouse, and full-on Web browser.

This year marks the debut of Yahoo Connected TV's Widgets; the Widgets are now offered on Internet-capable sets from LG, Samsung, Sony, and Vizio. The services and presentation vary from one brand to another, as manufacturers make different choices about which Widgets to offer. Widgets are available for news, weather, and sports information, as well as for access to popular sites like YouTube, Twitter, Flickr, and Facebook.

Yahoo Widgets isn't the only Internet connectivity going. Sony has continued to develop its Bravia Internet Video Link version of streaming, Web-based content--which includes modules for Amazon Video on Demand, CBS, Netflix, Slacker Internet Radio, YouTube, and more. Other sets, such as units developed by Panasonic and by LG Electronics, have modules for services as well; the most popular inclusions are Amazon Video on Demand, Netflix and Vudu, along with photo sharing sites such as Flickr and Picasa, and streaming music Websites like Pandora and Slacker Internet Radio.

Network connectivity can also give you access to the content you've stored on your home network, including CDs you've ripped, digital photos, and digital home movies. The Digital Living Network Alliance certifies most connected TVs; put your files on a DLNA-certified storage device on your network, and a DLNA television will be able to play your music and screen your photos and videos. According to the Alliance's Website (www.dlna.org), more than 500 DLNA-certified television models are available. Windows Media Player 11 and 12 are DLNA servers too, so a PC running XP, Vista, or Windows 7 will work, if you use WMP 11 or 12 for your media library.

Integrated ethernet has another advantage: Upgrades to your television's software can download automatically, so the updated firmware, or a new Widget or other service, will be available the next time you turn on your HDTV.

Network connectivity will give you access to an enormous amount of additional content--much of it on-demand and free--but do some research before plunking down your cash if you want specific services or capabilities. Having an ethernet connection doesn't automatically mean that an HDTV will stream media through your home network, or have all the Web services you seek.

Going 'Green'?

Consumer electronics are going green, and the HDTV market is no exception to the trend. While plasma flat panels continue to consume more electricity than LCD models, both technologies have made notable strides in energy conservation.

For example, "eco modes" dim the picture to save energy when the viewer doesn't need full brightness, working in much the same way as a draft mode on a printer. Automatic ambient light sensors can adjust an image's brightness, saving energy. And plasma manufacturers have developed some more-efficient technologies that reduce the power a plasma television consumes.

Most manufacturers are not shy about touting their "greenness" whenever possible. To check for lower power consumption, look for the Energy Star 3.0 logo. To qualify for this optional program, run jointly by the U.S. Environmental Protection Agency and the U.S. Department of Energy, an HDTV must not exceed a specified maximum power consumption limit. This limit differs somewhat depending on the set's screen size, and the Energy Star program breaks out HDTVs into three segments: smaller than 40 inches, 40 inches to 58 inches, and larger than 58 inches.

The next revision of Energy Star, Version 4.0, is due to take effect on May 1, 2010, and Version 5.0 will replace it on May 1, 2012. These new specifications further reduce the maximum allowable amount of energy that a qualifying set can use.

Other industry groups are promoting their own energy consumption guidelines and logos. The LCD TV Association's "Green TV" initiative currently requires that a TV have an ambient light sensor that will automatically adjust the screen's brightness in response to room lighting conditions; dimming the screen in darkened rooms helps save energy (more stringent guidelines are in the works). So far, only LG Electronics has qualified its televisions--four lines, to be precise--for this logo. Of course, the "Green TV" logo alone doesn't determine whether a TV has this capability: Sony's VE5 series of televisions has an ambient light sensor, too, among other eco-friendly modes.

Meanwhile, the California Energy Commission (CEC) has proposed legislation dictating its own energy consumption limits for televisions. Differing from optional programs like Energy Star and Green TV, the CEC's proposed regulations would be mandatory. Under the CEC's proposal, it would be illegal to sell nonconforming products in the state of California. (Because California's market is so big, such regulations could affect sets nationwide, as well.)

The CEC has stated plans to make a final decision about the specifications--which would likely place limits on the maximum operating power consumption for all flat-panel televisions, as well as on other modes such as standby--this November. All flat-panel TVs--especially larger plasmas--could be affected by this legislation; some models meet the requirements, and others do not. Meanwhile, plasma TVs remain, on average, less expensive than same-size LCD TVs (a 50-inch plasma can be about $300 less than a comparable LCD).

According to some sources, consumers can save $15 to $30 a year on their electricity bills by choosing a set with lower power consumption. This may not seem like a big deal, but consider that the average U.S. consumer keeps a television for ten years or longer. A savings of $150 to $300 on a set that originally cost $500 to $1000 is a significant amount.
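The savings arithmetic works out as follows, using the article's own figures:

```python
# Lifetime electricity savings from a lower-power set, per the figures above.
annual_savings_low, annual_savings_high = 15, 30   # dollars per year
lifetime_years = 10                                # typical ownership period

low = annual_savings_low * lifetime_years          # $150
high = annual_savings_high * lifetime_years        # $300
print(f"Lifetime savings: ${low} to ${high}")

# Against a $500-$1000 purchase price, that's 15 to 60 percent of the
# original cost of the set, recovered over its lifetime.
```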

Whither OLED?

What about the new, revolutionary flat-panel technologies--such as OLED--that you may have heard of and that are supposed to arrive "any day now"? The answer is that they're not coming any time soon, at least not in a way that will have any significant impact on your buying decision at present.

In recent years, we've had an alphabet soup of new technologies, including FED and SED, paraded around at fancy demos. These promising advances have fizzled for many reasons, but perhaps the unrelenting decline in LCD and plasma prices is most to blame for their failure to come to market. A set that was targeted to sell for $2500 a few years ago must now sell for under $1000, wiping out any profit that would have been made at the higher price. As LCD and plasma manufacturers continue to improve production efficiencies and economies of scale, it becomes increasingly difficult for new display technologies to gain a foothold.

Most Promising (Maybe): OLED

The most promising technology that might yet catch on remains the organic light-emitting diode (OLED) display. OLEDs are emissive, like a picture-tube (CRT) television, which eliminates issues with viewing angles; and the technology is fast and highly responsive, so it lacks motion blurring. OLED displays are incredibly thin; the entire display is just a thin layer on the back of a sheet of glass (or plastic or other substrate). Blacks are deep and endless, colors are stunning, and the whole thing requires very little power.

Sony has been selling the XEL-1 OLED TV for a couple of years now, and it is the first and (still) the only OLED TV on the market. (OLED screens are used primarily in mobile devices such as cell phones and media players like the Microsoft Zune HD.) The problem with the XEL-1 is that it's about one-sixteenth the size of a 42-inch flat panel, yet costs two to four times as much. It measures only 11 inches diagonally, about the size of a netbook screen, so it's barely large enough for a personal TV. It also carries a mere 960-by-540-pixel resolution, so it's not even high definition. And with a $2500 price tag, it's impossible to get excited about the value proposition.

While many companies have set (and missed) delivery dates for larger-format OLED HDTVs, the only company with a concrete commitment is Samsung, which plans to sell a 15-inch, 720p-capable model in Korea this year. Samsung hasn't yet announced pricing, but some sources have predicted that it will be in line with the Sony model's. Other sets are rumored for delivery in 2010, but as past experience has shown, don't believe they're coming until they hit store shelves. While small-screen production for mobile devices is proceeding well, manufacturers are finding it more difficult than expected to transfer that experience to the larger panels an HDTV requires. Fabrication costs are higher and yields lower than hoped, making it nearly impossible to produce a TV-sized panel at a competitive price. At least for now.

So today, your choices are LCD and plasma for flat panels. LCD is just about the only choice for sets smaller than 40 inches diagonal. Plasma continues to have a price advantage in sets 40 inches and larger, so compare carefully when choosing between the two technologies.

Beyond the Core

Manufacturers continue to find ways to differentiate their products. For example, most flat-panel TVs now have extensive settings that allow the image to be adjusted to optimal settings for a given room or piece of content (video or film, say). Some will even store different configurations for day and night viewing. In the past, these were locked up so that only qualified technicians could access them, and a professional calibration service could cost hundreds, or even thousands, of dollars. Now, most people buy and install their flat panel themselves--so manufacturers are opening up the advanced configuration settings to the end user.

Note, however, that do-it-yourselfers can get lost in the maze of settings and end up with an image that is far from optimal. Push the edge-enhancement settings too far, and objects on the screen can develop comic-book outlines. Misadjust the motion-compensation settings, and you can introduce noticeable artifacts (even as you fix other, equally noticeable ones). If you do change any of the settings, make sure you know how to get back to the factory defaults in case things get hopelessly tangled. You can also find DVDs that provide test images and instructions to help you adjust your HDTV's settings with more precision than just eyeballing it.

Another feature area is the quality of the sound the set can produce. Some can simulate multi-channel surround sound, while others just have simple stereo speakers (and not always of outstanding quality at that). Some models, such as the Toshiba REGZA XV648 series, include volume-leveling features that even out the quiet and loud parts of programming. Toshiba uses technology from Dolby for this, though other companies offer similar technology. If you're serious about sound, you'll still want a separate surround sound setup.

A feature coming to sets in the near future is 3D capability; for a discussion, see "3DTV: The Next Big Thing?"

Final Word

Some of the newfangled features we've discussed here will help you experience your entertainment content differently, or improve how it looks on screen. In general, as you try to decide what to buy, resist being dazzled by shiny, sparkly things. Focus instead on the attributes that will matter when you're watching various kinds of content on your new TV.

pcworld

Monday, October 26, 2009

5 New Technologies That Will Change Everything

While sipping a cup of organically farmed, artisan-brewed tea, I tap on my gigabit-wireless-connected tablet to pull up a 3D movie on the razor-thin HDTV hanging on the wall. A media server streams the film via a superspeedy USB connection to a wireless HD transmitter, which then beams it to the TV.

That actor--who was he? My augmented-reality contact lenses pick up the unique eye motion I make when I have a query, which I then enter on a virtual keyboard that appears in the space in front of me. Suddenly my field of vision is covered with a Web page showing a list of the actor's movies, along with some embedded video clips.

These technologies will come to life in the distant future, right? Future, yes. Distant, no.

Speed and content (much of it video) will be paired consistently across mobile, laptop, desktop, and home-entertainment systems. New ways of using video--including adding 3D depth or artificial visual overlays--will require more speed, storage, and computational power.

In our preview of technologies that are well on their way to reality, we look at the connective tissue of USB 3.0, 802.11ac, and 802.11ad for moving media--especially video--faster; at HTML5 for displaying video and content of all kinds consistently across all our devices; at augmented reality to see how the digital world will stretch into our physical reality by overlaying what we see with graphics and text; and at 3D TV, which will add image depth and believability to the experience of watching TV.

USB 3.0

The new USB 3.0 standard preserves backward compatibility by allowing older cables to plug into newer jacks; but newer cables like this one have extra pins that boost the data rate to 4.8 gbps.
Before you leave work, you need to back up your computer. You push a button, and 5 minutes later, while you're still packing up, your system has dumped 150GB of data onto an encrypted 512GB superfast solid-state drive, which you eject to take with you for offsite backup. On your way home, you stop at a movie kiosk outside a fast-food restaurant and buy a feature-length 3D video download on sale. You plug in your drive, the kiosk reads your credentials, and while you watch a 90-second preview of coming attractions, the 30GB video transfers onto your SSD. You pull out the drive and head home.

USB may be one of the least-sexy technologies built into present-day computers and mobile devices, but speed it up tenfold, and it begins to sizzle. Cut most of the other cables to your computer, and the standard ignites. Bring in the potential of uncompressed video transfer, and you have a raging fire.

Any task that involves transferring data between your PC and a peripheral device--scanning, printing, or transferring files, among others--will be far faster with USB 3.0. In many cases, the transfer will be complete before you realize it has started.

The 3.0 revision of USB, dubbed SuperSpeed by the folks who control testing and licensing at the USB Implementers Forum (USB-IF), is on track to deliver more than 3.2 gigabits per second (gbps) of actual throughput. That transfer rate will make USB 3.0 five to ten times faster than other common desktop peripheral interfaces, except some flavors of DisplayPort and the increasingly out-of-favor eSATA.

In addition, USB 3.0 can shoot full-speed data in both directions at the same time, an upgrade from 2.0's "half duplex" (one direction at a time) rates. USB 3.0 jacks will accept 1.0 and 2.0 plug ends for backward compatibility, but 3.0 cables will work only with 3.0 jacks.

This technology could be a game-changer for device connectivity. A modern desktop computer today may include jacks to accommodate ethernet, USB 2.0, FireWire 400 or 800 (IEEE 1394a or 1394b) or both, DVI or DisplayPort or both, and--on some--eSATA. USB 3.0 could eliminate all of these except ethernet. In their place, a computer may have several USB 3.0 ports, delivering data to monitors, retrieving it from scanners, and exchanging it with hard drives. The improved speed comes at a good time, as much-faster flash memory drives are in the pipeline.

USB 3.0 is fast enough to allow uncompressed 1080p video (currently our highest-definition video format) at 60 frames per second, says Jeff Ravencraft, president and chair of the USB-IF. That would enable a camcorder to forgo video compression hardware and patent licensing fees for MPEG-4. The user could either stream video live from a simple camcorder (with no video processing required) or store it on an internal drive for later rapid transfer; neither of these methods is feasible today without heavy compression. Citing 3.0's versatility, some analysts see the standard as a possible complement--or even alternative--to the consumer HDMI connection found on today's Blu-ray players.
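A quick back-of-the-envelope check on that claim: uncompressed 1080p at 60 frames per second and 24 bits per pixel needs just under 3 gbps, which fits within USB 3.0's roughly 3.2 gbps of real throughput:

```python
# Bandwidth needed for uncompressed 1080p video at 60 frames per second.
width, height = 1920, 1080
bits_per_pixel = 24          # 8 bits each for red, green, and blue
fps = 60

bits_per_second = width * height * bits_per_pixel * fps
gbps = bits_per_second / 1e9
print(f"Uncompressed 1080p60: {gbps:.2f} gbps")   # ~2.99 gbps

usb3_throughput_gbps = 3.2   # actual throughput cited for USB 3.0
print("Fits in USB 3.0:", gbps < usb3_throughput_gbps)
```

USB 2.0's real-world throughput of a few hundred mbps falls an order of magnitude short of this, which is why uncompressed camcorder streaming is only feasible with 3.0.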

The new USB flavor could also turn computers into real charging stations. Whereas USB 2.0 can produce 100 milliamperes (mA) of trickle charge for each port, USB 3.0 ups that quantity to 150mA per device. USB 2.0 tops out at 500mA for a hub; the maximum for USB 3.0 is 900mA.

With mobile phones moving to support USB as the standard plug for charging and syncing (the movement is well underway in Europe and Asia), and with U.S. carriers having recently committed to doing the same, the increased amperage of USB 3.0 might let you do away with wall warts (AC adapters) of all kinds.

In light of the increased importance and use of USB in its 3.0 version, future desktop computers may very well have two internal hubs, with several ports easily accessible in the front to act as a charging station. Each hub could have up to six ports and support the full amperage. Meanwhile, laptop machines could multiply USB ports for better charging and access on the road. (Apple's Mac Mini already includes five USB 2.0 ports on its back.)

The higher speed of 3.0 will accelerate data transfers, of course, moving more than 20GB of data per minute. This will make performing backups (and maintaining offsite backups) of increasingly large collections of images, movies, and downloaded media a much easier job.
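That 20GB-per-minute figure follows directly from the throughput number, and it also checks the backup scenario from the start of this section:

```python
# How USB 3.0's throughput translates into real-world transfer times.
throughput_gbps = 3.2                    # actual throughput, per the USB-IF
bytes_per_second = throughput_gbps * 1e9 / 8
gb_per_minute = bytes_per_second * 60 / 1e9
print(f"Throughput: {gb_per_minute:.0f} GB per minute")   # 24 GB/min

backup_gb = 150                          # the backup in the opening scenario
minutes = backup_gb / gb_per_minute
print(f"150GB backup: about {minutes:.0f} minutes")
```

At a sustained 3.2 gbps, the 150GB office backup takes just over six minutes, close to the "5 minutes later" of the scenario (which assumes the drive can keep up).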

Possible new applications for the technology include on-the-fly syncs and downloads (as described in the scenario above). The USB-IF's Ravencraft notes that customers could download movies at the gas pump of a filling station. "With high-speed USB [2.0], you couldn't have people waiting in line at 15 minutes a crack to download a movie," Ravencraft says.

Manufacturers are poised to take advantage of USB 3.0, and analysts predict mass adoption of the standard on computers within a couple of years. The format will be popular in mobile devices and consumer electronics, as well. Ravencraft says that manufacturers currently sell more than 2 billion devices with built-in USB each year, so there's plenty of potential for getting the new standard out fast.

Video Streaming Over Wi-Fi

Video Over Wi-Fi

Today's Wi-Fi will be left in the dust by 802.11ac and 802.11ad, both of which will be capable of carrying multiple video streams and of operating at far higher data rates.
When you get home--with your high-def, 3D movie stored on a flash drive--you plug the drive into your laptop and transfer it to your network file server over a gigabit Wi-Fi connection. A couple of minutes later, the movie is ready to stream via a 60GHz wireless link from your networked entertainment center to your wall-mounted HDTV.

Wired ethernet has consistently achieved higher data speeds than Wi-Fi, but wireless standards groups are constantly trying to figure out ways to help Wi-Fi catch up. By 2012, two new protocols--802.11ac and 802.11ad--should be handling over-the-air data transmission at 1 gbps or faster.

As a result, future users can have multiple high-definition video streams and gaming streams active across a house and within a room. Central media servers, Blu-ray players, and other set-top boxes can sit anywhere in the home, streaming content to end devices in any location. For example, an HD video display, plugged in with just a power cord, can stand across the room from a Blu-ray player, satellite receiver, or computer--no need for expensive, unsightly cables.

The 802.11ac and 802.11ad standards should be well suited for home use, though their applications will certainly extend far beyond the home. The names reflect the internal method of numbering that the engineering group IEEE uses: 802 for networking, 11 for wireless, and one or more letters in sequence for specific task groups (that's how we got 802.11a, b, g, h, n, and others).

The 802.11ac standard will update 802.11n, the latest and greatest of a decade's worth of wireless local area networking (WLAN) technology that began with 802.11b. With 802.11ac, wireless networking performance will leap from a theoretical top speed of 600 mbps to a nominal maximum of more than 1 gbps. In practice, the net data carried by 802.11ac will likely be between 300 mbps and 400 mbps--up from 160 mbps or so for a good real-world 802.11n setup, and more than enough capacity to carry multiple compressed video streams over a single channel simultaneously. Or users may assign individual streams running on unique frequencies to a number of separate channels. Like 802.11n, 802.11ac will use multiple antennas for receiving and sending data wirelessly.
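A quick sketch of what that capacity buys. Both inputs are assumptions: 350 mbps is the midpoint of the 300-to-400 mbps real-world estimate, and 15 mbps is a typical bit rate for a compressed HD stream.

```python
# How many compressed HD streams fit in 802.11ac's real-world throughput?
net_mbps = 350      # assumed midpoint of the 300-400 mbps estimate
stream_mbps = 15    # assumed typical compressed HD stream bit rate

streams = net_mbps // stream_mbps
print(f"Roughly {streams} simultaneous compressed HD streams")
```

Even at half that throughput, a single channel could comfortably serve every screen in a typical home.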

The 802.11ac flavor still won't have the capacity to carry lossless high-definition video (video that retains the full fidelity and quality of the raw source), however. Today, lossless video is common over wired connections after decompression or decoding of a data stream from a satellite, cable, or disc. The right hardware will be able to take the 802.11ac compressed data stream and send it directly to a decoder in an HDTV set; some HD sets already have this capability today. But when uncompressed video has to stream at a rate faster than 1 gbps, a speedier format must be used.
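The arithmetic behind that limit is straightforward. Assuming 1080p video at 60 frames per second and 24 bits per pixel (common figures, not ones specified in the article):

```python
# Why lossless HD won't fit under ~1 gbps: uncompressed 1080p60 bit rate.
width, height = 1920, 1080
bits_per_pixel = 24
fps = 60

gbps = width * height * bits_per_pixel * fps / 1e9
print(f"Uncompressed 1080p60 needs about {gbps:.2f} gbps")
```

That is roughly three times 802.11ac's nominal ceiling, which is exactly the gap 802.11ad is meant to fill.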

That's where 802.11ad comes in. It abandons the 2.4GHz and 5GHz bands of the spectrum (where today's Wi-Fi works) in favor of the newly available 60GHz spectrum. Because the 60GHz spectrum has an ocean of frequencies available in most countries--including in the United States--you'll be able to use multiple distinct channels to carry more than 1 gbps of uncompressed video each.

Unfortunately, the millimeter waves that make up 60GHz signals penetrate walls and furniture poorly, and oxygen readily absorbs the waves' energy. So 802.11ad is best suited for moving data across short distances between devices in the same room. Apart from supporting fast video transfers, 802.11ad will permit you to move files or sync data between devices at speeds approaching that of USB 3.0--and 1000 times faster than Bluetooth 2.

The 802.11ad spec is one of three competing ideas for using the 60GHz band of the spectrum. The WirelessHD trade group, a consortium of consumer electronics firms, is focusing on video use of the 60GHz band, while the Wireless Gigabit Alliance (WiGig) is looking at networking and consumer uses. Membership in the various groups overlaps, making an interoperable and perhaps unified spec possible. Though 802.11ad doesn't specifically address video, it will be a generic technology that can accommodate many kinds of data. At a minimum, each group will work to keep its technology from interfering with the others'.

The combination of 802.11ac and 802.11ad, coupled with USB 3.0, will allow you to position clusters of computer equipment and entertainment hardware around your home. USB 3.0 and gigabit ethernet might connect devices located in a cabinet or on a desk; 802.11ac will link clusters across a home; and 802.11ad will carry data to mobile devices, displays, and other gear within a room.

Allen Huotari, the technical leader at Cisco Consumer Products (which now includes Linksys products and ships millions of Wi-Fi and ethernet devices each year), says that the change in home networks won't result from "any one single technology in the home, but rather a pairing of technologies or a trio of technologies--wired and/or wireless--for the backbone and the wireless on the edges."

This means fewer wires and cables, better speeds, and higher-quality video playback than anything possible today. By 2012, both specifications should be readily available.

3D TV

Panasonic and other high-definition TV makers are looking to faux 3D technology to provide stereoscopic depth--and a reason for consumers to buy a newer set.
Disconnecting your active-shutter 3D glasses from a charger, you slip them on, eager to check out your downloaded copy of Hulk VI: Triumph of the Stretch Fabrics, the latest entrant in the green antihero's film franchise. You drop into a comfy chair, tell the kids it's time for a movie, and twist the heat pouch on a bag of popcorn to start it popping. The kids grab their own glasses and sit down to watch the Hulk knock the Predator practically into their laps!

When television makers introduced HDTVs, it was inevitable that they would figure out a way to render the technology obsolete not long after everyone bought a set. And they have. The next wave in home viewing is 3DTV--a 2D picture with some stereoscopic depth.

As 3D filmmaking and film projection technology have improved, Hollywood has begun building a (still small) library of depth-enhanced movies. The potential to synthesize 2D movies into 3D could feed demand, however--the way colorizing technology increased interest in black-and-white films in some circles in the 1980s. For movies based on computer animation--such as Toy Story 3D, a newly rendered version of the first two movies in the series--it's already happening.

The promise of 3D is a more immersive, more true-to-life experience, and substantively different from almost anything you've watched before. In commercial theaters, 3D projection typically involves superimposing polarized or distinctly colored images on each frame and then having viewers wear so-called "passive" glasses that reveal different images to each eye. The brain synthesizes the two images into a generally convincing notion of depth.

In contrast, 3D at home will almost certainly rely on alternating left and right views for successive frames. HDTVs that operate at 120Hz (that is, 120 cycles of refresh per second) are broadly available, so the ability to alternate left and right eye images far faster than the human eye can follow already exists. Fundamental industry standards are in place to allow such recording, says Alfred Poor, an analyst with GigaOm and the author of the Web site HDTV Almanac.

Viewing 3DTV displays will require "active" glasses that use rapidly firing shutters to alternate the view into each eye. Active glasses are expensive today, but their price will drop as 3D rolls out. Meanwhile, designers are in the development phase of producing a 3D set that doesn't require the glasses.

Sony and Panasonic have announced plans to produce 3D-capable displays, and Panasonic recently demonstrated a large-screen version that the company expects to ship in 2010. As happened when HDTVs rolled out, premium 3DTVs will appear first, followed by progressively more-affordable models.

Creating and distributing enough 3D content to feed consumers' interest may be more of a challenge. Poor notes that filmmakers are currently making or adapting only a handful of features each year for 3D. But techniques to create "synthetic 3D" versions of existing films (using various tracking, focus, and pattern cues for splitting images) could fill the gap.

Existing terrestrial cable and IPTV networks should be able to distribute 3D content. The bandwidth that such networks use to deliver typical HD broadcasts will be adequate for delivering 3D video once the networks upgrade to newer video compression techniques. Satellite may face a more difficult road, since such systems already use the best levels of compression.

For physical media playback, Blu-ray can store the data needed, and 3D Blu-ray players are already on the drawing board. No fundamental changes in Blu-ray will be necessary, so the trade group that created the standard is focusing on compatibility--such as ensuring that a 2D TV can play a 3D disc.

Standards issues might not end up being very troublesome, so long as the 3DTVs are flexible enough. An industry group is working on setting some general parameters, much as digital TV was broken up into 480, 720, and 1080 formats, along with progressive and interlaced versions. A 3DTV may need to support multiple formats, but all will involve alternating images and a pair of shutter-based glasses.

Poor expects that 3DTV will be but a minor upgrade to existing HDTV sets. The upgraded sets will need a modified display controller that alternates images 60 times per second for each eye, as well as an infrared or wireless transmitter to send synchronization information to the 3D glasses.
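The frame timing follows directly from the panel's refresh rate. A sketch of the arithmetic for a 120Hz set:

```python
# Shutter-glasses timing on a 120Hz panel: images alternate between
# eyes, so each eye sees half the refresh rate, and each shutter stays
# open for exactly one panel refresh.
panel_hz = 120
per_eye_hz = panel_hz // 2
shutter_open_ms = 1000 / panel_hz

print(f"{per_eye_hz} images/s per eye, "
      f"{shutter_open_ms:.2f} ms per shutter window")
```

At just over 8 milliseconds per shutter window, the alternation is far faster than the human eye can follow.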

"Augmented Reality" in Mobile Devices

Babak Parviz, a professor at the University of Washington specializing in nanotechnology, is working on a bionic contact lens that would paint imagery and information directly on the eye to augment reality.
You enjoyed Hulk VI so much on your home theater setup that you decided to see it on the big screen. The movie is still playing, but you're not sure how to find the theater. In the old days, you might have printed out directions from MapQuest; nowadays you don't need to do anything so primitive. Instead, you dock your smartphone on the dashboard as you slip into your car, and instantly driving directions to the theater are superimposed on your car's windshield. As you approach your destination, you see a group of tall buildings. Superimposed on the windshield over one of them are the building's name, the name of the movie theater inside it, the title Hulk VI, and a countdown to show time. "Turn left in 100 yards," the navigator speaks through your stereo as a large turning arrow appears, guiding you into the parking structure.

In Neal Stephenson's book Snow Crash, "gargoyles" are freelance intelligence gatherers who have wired themselves to see (through goggles that annotate all of their experiences) a permanent overlay of data on top of the physical world. In less immersive fashion, we may all become gargoyles as “augmented reality” becomes an everyday experience.

Augmented reality is a catchall term for overlaying what we see with computer-generated contextual data or visual substitutions. The point of the technology is to enhance our ability to interact with things around us by providing us with information immediately relevant to those things.

At work, you might walk around the office and see the name and department of each person you pass painted on them--along with a graphical indicator showing what tasks you owe them or they owe you. Though many case scenarios involve “heads-up” displays embedded in windshields or inside eyeglasses, the augmented reality we have today exists primarily on the “heads-down” screens of smartphones.

Several companies have released programs that overlay position- and context-based data onto a continuous video camera feed. The data comes from various radios and sensors built into modern smartphones, including GPS radios (for identifying position by satellite data), accelerometers (for measuring changes in speed and orientation), and magnetometers (for finding position relative to magnetic north).
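To show how those sensors combine, here is a minimal sketch of the core overlay calculation: the GPS gives the bearing from the viewer to a point of interest, the magnetometer gives the phone's compass heading, and the difference says where on screen to draw the label. The coordinates and heading below are hypothetical illustrations, not values from any real app.

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 toward point 2, in degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return math.degrees(math.atan2(y, x)) % 360

# A landmark due east of the viewer sits at bearing 90 degrees; if the
# magnetometer says the phone faces 45 degrees, the label belongs
# 45 degrees to the right of screen center.
heading = 45.0
offset = (bearing_deg(0.0, 0.0, 0.0, 1.0) - heading + 180) % 360 - 180
print(f"Draw label {offset:+.0f} degrees from screen center")
```

Real apps refine this with the accelerometer (to know the camera's tilt) and the field of view of the camera lens, but the compass-minus-bearing offset is the heart of the overlay.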

In an application called Nearest Places, the names and locations of subway stops, parks, museums, restaurants, and other places of interest are shown on top of an iPhone's video feed. As you walk or turn, the information changes to overlay your surroundings.

"Smartphones and the related apps are the trailblazers for augmented reality," says Babak Parviz, a professor at the University of Washington who specializes in nanotechnology. "In the short to medium term, my guess is that they will dominate the field."

Other prototype applications display information dropped at particular coordinates as 3D models that the user can walk around, or as animations whose details update in 3D relative to the user's position. But the technology for those apps isn't ripe yet; handhelds require a more-precise positioning mechanism in order to handle that kind of data insertion. Fortunately, each smartphone generation seems to include more and better sensors.

In other realms, augmented reality may serve to provide not just additional information, but enhanced vision. One day, infrared cameras mounted on the front of a car will illuminate a far-away object represented as a bright-as-day image on an in-windshield display. Radar signals and wireless receivers will detect and display cars that are out of sight; and one piece of glass will host GPS and traffic reporting.

Leaping past displays, Parviz and his team are working on ways to put the display directly on the eyeball. They’re trying to develop a technology for embedding video circuitry into wearable contact lenses. While wearing such contact lenses, you would see a continuous, context-based data feed overlaid on your field of vision.

Before Parviz's lenses become a reality, augmented reality is likely to become a routine navigation and interaction aid on mobile devices. In addition, game developers may use the technology to overlay complete digital game environments over the reality that gamers see around them.

HTML5

Web pages built with HTML5 will display the same on any browser--desktop or mobile.

Hulk VI was great, but what should you watch this evening? Before heading off to work in the morning, you click to some trailers on a movie Website, but you don't have time to watch many. So you use your mobile phone to snap a picture of the 2D barcode on one of the videos; the phone's browser then takes you to the same site. On the commuter train to the office, you watch the previews over a 4G cell phone connection. A few of the movies have associated games that you try out on your phone, too.

Remember when every Website had a badge that read "optimized for Netscape Navigator" or "requires Internet Explorer 4"? In the old days, people made Web pages that worked best with--or only with--certain browsers. To some extent, they still do.

The new flavor of HTML--the standard markup language for writing Web pages--is called HTML5 (Hypertext Markup Language version 5), and it aims to put that practice to bed for good.

Specifically, HTML5 may do away with the need for audio, video, and interactive plug-ins. It will allow designers to create Websites that work essentially the same on every browser--whether on a desktop, a laptop, or a mobile device--and it will give users a better, faster, richer Web experience.

Instead of leaving each browser maker to rely on a combination of its in-house technology and third-party plug-ins for multimedia, HTML5 requires that the browser have built-in methods for audio, video, and 2D graphics display. Patent and licensing issues cloud the question of which audio and video formats will achieve universal support, but companies have plenty of motivation to work out those details.
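In practice, that means a page can embed video with a single built-in element instead of a plug-in. A minimal sketch, generated here as a Python string; the file name and MIME type are hypothetical placeholders:

```python
# A minimal HTML5 page relying on the built-in <video> element rather
# than a browser plug-in. File name and MIME type are placeholders.
def video_page(src, mime="video/mp4"):
    return f"""<!DOCTYPE html>
<html><body>
  <video controls width="640">
    <source src="{src}" type="{mime}">
    Your browser does not support HTML5 video.
  </video>
</body></html>"""

print(video_page("trailer.mp4"))
```

Any browser with HTML5 support plays the clip directly; an older browser falls back to the plain-text message inside the tag.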

In turn, Website designers and Web app developers won't have to deal with multiple incompatible formats and workarounds in their efforts to create the same user experience in every browser.

This is an especially valuable advance for mobile devices, as their browsers today typically have only limited multimedia support. The iPhone’s Safari browser, for example, doesn't handle Adobe Flash--even though Flash is a prime method of delivering video content across platforms and browsers.

"It'll take a couple of years to roll out, but if all the browser companies are supporting video display with no JavaScript [for compatibility handling], just the video tag and no plug-in, then there's no downside to using a mobile device," says Jeffrey Zeldman, a Web designer and leading Web standards guru. "Less and less expert users will have better and better experiences."

Makers of operating systems and browsers appear to be falling into line behind HTML5. Google Chrome, Apple Safari, Opera, and WebKit (the development package that underlies many mobile and desktop programs), among others, are all moving toward HTML5 support.

For its part, Microsoft says that Internet Explorer 8 will support only parts of HTML5. But Microsoft may not want to risk having its Internet Explorer browser lose more market share by resisting HTML5 in the face of consensus among the other OS and browser makers.

HTML5 is now completing its last march toward a final draft and official support by the World Wide Web Consortium.

Microsoft unwraps netbook Windows 7 upgrade tool

Microsoft has released a tool that lets netbook owners install Windows 7 on their machines using a USB flash drive, sidestepping the usual requirement of a DVD drive.

The utility, the Windows 7 USB/DVD Download Tool, creates a bootable flash drive from a downloaded .iso file, or disk image, of Windows 7, such as one purchased from Microsoft's online store.

"This tool allows you to create a copy of the .iso file to a USB flash drive or a DVD," said Microsoft in the instructions accompanying the tool. "To install Windows 7 from your USB flash drive or DVD, all you need to do is insert the USB flash drive into your USB port or insert your DVD into your DVD drive and run Setup.exe from the root folder on the drive."

The USB/DVD Download Tool solves the problem facing netbook users who want to upgrade to Windows 7, since virtually all netbooks lack a DVD drive. Earlier this year, rumors circulated that Microsoft might offer Windows 7 upgrades on a flash drive, but the talk turned out to be nothing but wishful thinking.

Users need a 4GB USB drive to install Windows 7 on a PC without an optical drive, Microsoft said. Other requirements include .NET Framework 2.0 or later, and the ability to run as administrator on the to-be-upgraded netbook.
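Before running the tool, it's worth confirming the flash drive actually meets the 4GB minimum. A small, hedged sketch of that check; the mount point (`"."` here) is a placeholder--on Windows a flash drive would be something like `E:\`:

```python
import shutil

MIN_BYTES = 4 * 1024**3  # the 4GB minimum Microsoft cites

def drive_big_enough(mount_point):
    """True if the drive at mount_point has at least 4GB total capacity."""
    return shutil.disk_usage(mount_point).total >= MIN_BYTES

# Placeholder path; substitute the flash drive's mount point or letter.
print(drive_big_enough("."))
```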

The netbook's BIOS must also be modified to set the boot order so that the USB drive is first on the list. "Please see the documentation for your computer for information on how to change the BIOS boot order of drives," Microsoft recommended.

Last Thursday, Microsoft warned users to seek help if they were unfamiliar with tweaking the BIOS. "If you are not comfortable making this type of BIOS change, I recommend you seek some assistance from your favorite 'tech geek,'" Microsoft spokesman Brandon LeBlanc urged in an entry to the Windows 7 blog.

Because most netbooks run Windows XP, only a "clean" upgrade to Windows -- Microsoft dubs it "Custom" during the installation -- is possible. That requires users to back up data and application settings before upgrading, then restore the data and settings, as well as reinstall all applications.

Computerworld

Sunday, October 25, 2009

Apple Could Face a US$1 Billion 'Tribute' to Nokia


Nokia has sued Apple over alleged infringement of technology patents in the iPhone. If Nokia wins the suit, Apple could face a serious headache: by some estimates, it would have to pay Nokia as much as US$1 billion.

In its complaint, Nokia claims that Apple infringed ten patents covering technologies such as wireless data transfer. Neil Mawston, an analyst at Strategy Analytics, says that if found liable, Apple might have to pay between US$200 million and US$1 billion, since the patents are used in roughly 34 million iPhones.

Nokia is among the key patent holders in mobile technology, alongside Qualcomm and Ericsson, which puts the company in a strong position to press its claims against Apple.

"It is almost inconceivable that anyone could produce a mobile phone without using Nokia's patented technology," says Ben Wood, director of research at CCS Insight.

Nokia says it had in fact tried to settle the matter amicably, on the condition that Apple pay to license its patents, but the U.S. vendor declined.

Nokia claims that all iPhone models, including the original and the 3G, use technology patented by the Finnish company, according to a Reuters report cited by detikINET on Sunday (25/10/2009).

detik

Thursday, October 22, 2009

Steve Ballmer Confirms Blu-ray Drives for the Xbox 360

One of the advantages that the PlayStation 3 has publicized since the beginning of this generation's console battle has been its integrated Blu-ray drive, which allows for bigger-capacity discs and lets the console play high-quality movies.

The Xbox 360 has tried to counter by backing the HD DVD format, but, with that no longer supported, Microsoft has long been rumored to be preparing to get a Blu-ray unit for the Xbox 360.

Now Steve Ballmer, Microsoft's chief executive officer, has told Gizmodo in an exclusive interview that Blu-ray is a definite possibility for the Xbox 360 home-gaming console. Asked about bringing the format to the Microsoft device, he said, "Well I don't know if we need to put Blu-ray in there--you'll be able to get Blu-ray drives as accessories." In other words, Microsoft does not intend to integrate the high-capacity drive into the console itself, but it might offer an add-on like the HD DVD drive it sold for a while with little success.


A Microsoft spokesperson later clarified Ballmer's comments, stating that, "Our immediate solution for Blu-ray-quality video on an Xbox 360 is coming this fall with Zune Video and 1080p instant-on HD streaming. As far as our future plans are concerned, we're not ready to comment."

Ballmer has clashed with the heads of the Xbox division in the past over the possibility of integrating Blu-ray. The CEO has been quoted as hinting that the drive could be added to the device at some point, considering that HD DVD is all but dead; but on every occasion, other Microsoft executives have said that Ballmer was being overenthusiastic.

Windows 7 Priced at Rp 900,000 to Rp 3.5 Million

Microsoft may claim it has set special pricing for Windows 7 in Indonesia, but the prices on offer will still dig fairly deep into users' pockets.

Lukman Susetio, Windows Client Product Manager at Microsoft Indonesia, explained that Windows 7 pricing for Indonesia falls into the emerging-market category.

Of the four editions on offer--Windows 7 Home Basic, Home Premium, Professional, and Ultimate--prices start at Rp 900,000. "The Ultimate edition runs between Rp 2 million and Rp 3.5 million," he told reporters on the sidelines of the Windows 7 and Windows Phone launch at the Hard Rock Cafe Jakarta on Thursday (22/10/2009).

Those prices, Lukman said, are lower than in countries grouped into the mature-market category--by as much as 10 to 20 percent.

"In fact, if you buy from an online store such as Amazon, the price may be higher, because it may be set for a mature market," he added.

Even so, Microsoft Indonesia Chief Operating Officer Faycal Bouchlagem said the company also has special pricing packages for certain customer segments, such as education, government, and enterprise, though he did not disclose those prices.


detik

Sunday, October 18, 2009

Utilization Of DVD ROMs By Computer Game Developers

Computer games are a hit with people of all generations. Before computer games, there were video consoles such as the Sega, Dreamcast, Game Boy, and others.


The video consoles were a hit mainly with the younger generation. Because of their poor, DOS-like graphics, they never spread across all age groups; programmers were limited by a lack of storage space. The adoption of DVD-ROMs opened up a whole world of possibilities to game developers.


Computer games are no longer limited to a tiny amount of storage space that prevents the use of high-quality graphics, and the fan following that the latest games generate is tremendous. Graphics have improved by leaps and bounds since the days of Wolfenstein. Right now there is Halo 3 for the Xbox 360, which will shortly be available for the PC platform; the game is packed with high-definition graphics, improved AI, a superior lighting engine, and new weapons, characters, and encounters.


Earlier, the computer motherboard had an AGP graphics slot limited to 4x speed. Newer PCI Express slots replaced it, supporting up to 16x speed; and the single PCI Express graphics card has in turn been joined by SLI systems that pair two graphics cards with identical specifications.


The adoption of DVD-ROMs by game developers has gone hand in hand with technologies like pixel shaders, hardware texture and lighting, vertex shaders, and stream processors, all of which greatly enhance the quality of what we see. Microsoft has bundled DirectX 10 with Windows Vista, its next-generation operating system, and it is expected to support the next generation of graphics and sound.


Game developers are no longer limited to making games that fit on CDs. Scripting and object-oriented programming (OOP) practices have been heavily adapted to suit the hardware now emerging to meet the needs of hardcore gamers.


Today's three-dimensional games are of immensely better quality than, say, the Unreal Tournament of 2002. Unreal Tournament revolutionized the first-person shooter genre with its Unreal Engine: it supported a wide range of system configurations, yet produced high-quality graphics even on lower-end machines, and the whole game was just over 600MB. The new Unreal Tournament 3, which hit the market at the end of 2007, weighs in at over 4GB, with an installed size roughly double that.


Game developers have now largely dumped CDs for packaging. The single-layer DVD has become an industry standard, and developers are increasingly shipping games on dual-layer DVDs. High-performance game engines take a lot of space to run, and they are bundled with many tools in addition to the game itself. Developers now have the luxury of spreading their creativity without having to think about the limits of the storage medium.
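The capacity jump behind that shift is easy to quantify. The figures below are the standard nominal capacities of each medium (an assumption of this sketch, not numbers from the article):

```python
# Nominal media capacities (decimal GB) behind the move from CD to DVD.
CD_GB = 0.7       # ~700MB CD-ROM
DVD_SL_GB = 4.7   # single-layer DVD
DVD_DL_GB = 8.5   # dual-layer DVD

print(f"One single-layer DVD holds about {DVD_SL_GB / CD_GB:.1f} CDs of data")
print(f"One dual-layer DVD holds about {DVD_DL_GB / CD_GB:.0f} CDs of data")
```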


Other industry-standard workhorses are the Half-Life and Quake game engines. Half-Life was the first FPS to take the market by storm with its superior gameplay and, for its time, incredible graphics; it went on to become game of the year. Quake likewise revolutionized team play, alongside its counterpart, the Unreal Engine.


Game engines are also pushing processors' multithreading capabilities to the max. A game that nearly fills a DVD eats up at least double that size on the hard disk: a 700MB game used to take around 500MB to 1400MB of disk space to install, while a two-DVD game today can consume an entire partition.

ESDM Department Prepares Migration to Open Source

On October 15-16, 2009, in Yogyakarta, colleagues from the Department of Energy and Mineral Resources (DESDM), led by the department's Center for Data and Information (PUSDATIN ESDM), held an IT outreach workshop with an emphasis on open source. The workshop was intended to prepare for a future implementation of open source software across the department--hopefully a reality under the incoming cabinet.

About 75 participants took part, all of them information technology managers within the DESDM. They came from several cities, including Cepu, Yogyakarta, Bandung, Jakarta, and Ombilin.

The speakers were Ibu Lolly of DEPKOMINFO, Pak R. Santoso of IPTEKNET, PUSILKOM UGM, and Onno W. Purbo. The DEPKOMINFO, IPTEKNET, and PUSILKOM UGM speakers focused on the importance and benefits of open source. Topics such as ease of use, freedom from viruses, and network security--especially data security--made for lively discussion. Pak Toto (R. Santoso) of IPTEKNET explained repeatedly that open source does not mean insecure; on the contrary, it can be very secure, because anyone can easily fix a bug or software flaw.

Ibu Lolly (DEPKOMINFO) explained how important open source is for making efficient use of the state budget: government agencies would no longer need to duplicate software purchases, but could instead fund local Indonesian developers to build an application once and make it available to every agency in the country.

Unsurprisingly, open source could also let the domestic software industry grow, rather than merely enriching foreign software companies.

Onno W. Purbo, who happened to have the slot from midday until dusk, focused on hands-on demos performed live by a participant who had never used open source software at all. Few had imagined beforehand that open source software need not be installed on a PC: it can run directly from a live CD or live DVD.

Toward the end of the demo session, he showed how to set up an FTP server and a file server, and demonstrated the SchoolOnffLine distro. He also explained how to build your own distro using the Ubuntu Customization Kit (UCK); for those who build distros regularly, creating one with UCK takes about two hours.

Some of the comments from participants:


Rosalin, PUSLITBANG Ketenagalistrikan DESDM: "Ubuntu is fine, but I give up on command-prompt-based servers. It would be good to hold technical training for unit administrators, especially on servers."

Donny, PUSDATIN: "This was my first time operating open source software. It turned out to be very easy."

Given such high enthusiasm, this will hopefully smooth the way for ESDM and other government institutions to migrate easily to open source software in 2010-2012.

Tuesday, June 30, 2009

Does Non-Original Ink Damage Printers?

The controversy over non-original ink (compatible and refill) versus original ink will never end. Inkjet printer makers (Canon, Epson, HP, Lexmark) claim that non-original ink does not produce prints as beautiful, sharp, or long-lasting as original ink. They also warn that non-original ink, especially refills, can damage the printer--and they threaten that the printer warranty vanishes if non-original ink is shown to have damaged the printhead.

On the other side, the number of non-original-ink players is growing like mushrooms in the rainy season. In Indonesia, this can be seen in the ever-wider spread of ink-refill outlets, which are no longer found only in computer malls but also in residential areas of many cities.

Acaciana, for example, has 26 refill-center outlets, including in Bontang, Makassar, Pekan Baru, and Serang. Venetta System has even more (around 100) across the country. Lombok, Palu, and even Papua have not escaped the reach of these refill ink and ribbon vendors, who serve inkjet, laser, and dot-matrix printers.

Another player, X-Fill, has been just as quick to grow its non-original ink business. Opened about two years ago, it has already spread to Batam and Bali. Meanwhile, e-print operates in Nabire, Biak, Kendari, Bangka, and Kupang, among other places.

Non-original ink is on display at every computer expo; not one show in the country fails to feature third-party ink vendors. At the FKI expo in Jakarta (May 10-13, 2009), third-party ink players were out in force, from Acaciana, e-print, X-Fill, and Venetta System to newcomers selling nothing but third-party ink in bulk (by the bottle).

The Price Gap
Non-original ink owes its popularity mainly to the wide gap between third-party prices and the price of OEM (original equipment manufacturer) ink from the printer makers. Take the HP45 ink used by the older HP DeskJet 970cxi and 6122 printers.

At the FKI Jakarta expo, a reseller of original ink priced that black cartridge at Rp 311,000, while its companion, the tri-color HP78, went for Rp 278,000. In total you would spend Rp 589,000 to run original ink in that HP printer.

But if you 'dare' to use third-party ink, you can save up to 80%. Acaciana, for example, offers refill ink at Rp 35,000 each for black and for color, or compatible cartridges (which Acaciana calls refills) at Rp 150,000 for color and Rp 125,000 for black. X-Fill offers the same refill package for less: Rp 40,000 for the HP45 and Rp 50,000 for the HP78, plus a 50% discount.
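As a quick arithmetic check, the sketch below plugs in the prices quoted above. This is purely illustrative, using the article's expo prices, not current figures:

```python
# Prices in rupiah, as quoted at the FKI Jakarta expo above.
original_set   = 311_000 + 278_000   # HP45 black + HP78 tri-color, original
refill_set     = 35_000 + 35_000     # Acaciana refill, black + color
compatible_set = 125_000 + 150_000   # Acaciana compatible, black + color

def savings(cheaper, original):
    """Fractional savings versus the original-ink price."""
    return (original - cheaper) / original

print(f"Original set: Rp {original_set:,}")                       # Rp 589,000
print(f"Refill savings:     {savings(refill_set, original_set):.0%}")      # 88%
print(f"Compatible savings: {savings(compatible_set, original_set):.0%}")  # 53%
```

So refilling saves even more than the headline 80%, while compatible cartridges cut the bill roughly in half.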

The savings apply to Epson users as well. Original T38 and T39 inks for the Epson Stylus C41SX/43SX/45/CX1500 sell for Rp 55,000 and Rp 100,000 respectively, while X-Fill's compatibles go for Rp 25,000 and Rp 40,000.

Quality and Warranty
The question is: do you dare to use third-party ink? If your printer is getting on in years and its warranty has expired, there is probably no harm in trying, especially if you print mostly text rather than photos and graphics.

According to TrustedReviews, many independent tests show that compatible black inks consistently deliver results on par with original cartridges. But if you want top-quality photo prints, stick with original ink plus the matching (photo) paper, since the OEM vendors run rigorous tests to ensure that their ink and paper give optimal results when used together.

One thing you must still watch for: choose non-original ink with a good reputation, because not all non-original cartridges are of equal quality. The very cheap ones are tempting, especially if you print a lot, but that low price may reflect poor ink quality (ink that dries too fast and clogs the printhead) and substandard plastic casings (which can leak).

Tests have also shown that prints, photos in particular, made with very cheap compatible inks do not last. Within less than a year the colors fade, although this also depends on the paper used.

The bottom line: non-original ink, compatibles in particular, will not necessarily damage your inkjet printer, especially if the head is part of the cartridge. Be careful, though, if the head is not replaceable. All of this assumes the ink does not leak.



BOX: The Z Number: What Determines Ink Quality
Pressing the "Print" button is easy. Getting a pristine print is not. All too often your inkjet's output is marred by specks or ink bleed. The culprit, according to a recent study, is ink with inconsistent flow properties.


As you may know, an inkjet printer has tiny (inkjet) nozzles. These nozzles spray ink onto the paper as it passes beneath them. Ideally, the ink forms a perfectly round droplet as it leaves the nozzle and lands on the paper exactly on target.

Droplet formation, however, also depends on the ink's properties. Density, surface tension, and viscosity (thickness) can all impede the flow (or its consistency). If the droplet size is off, you get specks and stray filaments instead of text with crisp lines, the researchers say.

The Secret of the Z Number
Researchers now use the Z number to characterize the surface tension and viscosity of a given ink. Ink with a lower Z is more viscous, while ink with a higher Z number has greater surface tension, explains materials scientist Jooho Moon of Yonsei University in Seoul, Korea, as quoted by Science News.

Theoretical work had suggested that the easiest inks to print have Z values between 1 and 10. But research published on March 3 by Moon and colleagues found that the best droplets form from inks with Z values between 4 and 14.

According to the researchers, who captured images of droplets from various inks mixed in the lab, inks with Z above 14 produce filaments that easily break away from the droplet and form a second droplet that causes bleeding. Droplets from inks with Z below 14 form more cleanly, the ink's viscosity pulling the filament back into the droplet. But droplets of very viscous ink, with Z below 4, stick to the nozzle instead; in other words, they fail to eject properly.
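In the inkjet literature the Z number is usually defined as the inverse Ohnesorge number: the square root of (density x surface tension x nozzle diameter) divided by viscosity. The sketch below assumes that definition; it is not code from the study, and the fluid values are invented for the example:

```python
import math

def z_number(density, surface_tension, nozzle_diameter, viscosity):
    """Z = sqrt(density * surface_tension * nozzle_diameter) / viscosity,
    the inverse Ohnesorge number. SI units throughout."""
    return math.sqrt(density * surface_tension * nozzle_diameter) / viscosity

def jetting_verdict(z):
    """Classify an ink against the 4 <= Z <= 14 window reported by Moon's team."""
    if z < 4:
        return "too viscous: the droplet tends to stick to the nozzle"
    if z > 14:
        return "satellite droplets likely: prints may bleed"
    return "well-formed droplets expected"

# Illustrative water-like ink: 1000 kg/m^3, 0.03 N/m surface tension,
# a 20-micron nozzle, and viscosity of 3 mPa*s.
z = z_number(1000, 0.03, 20e-6, 0.003)
print(f"Z = {z:.1f}: {jetting_verdict(z)}")  # Z = 8.2: well-formed droplets expected
```

Note that lowering viscosity or raising surface tension pushes Z up, consistent with the description above.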

Unfortunately, the study is still limited to a single printer. The optimal Z value can also vary, depending on how far the nozzle sits from the paper surface.

TIPS: Want Pristine Prints?

1. Try not to stockpile inkjet cartridges for too long, since the ink may dry out in storage. Here, cartridges from the OEMs (original ink) are usually more stable in storage, pigment-based inks in particular, which do not dry out quickly. If you must store cartridges, keep them away from strong light, humid conditions, and air pollution.

2. Use the right paper when printing. Large-pore (instant-dry) paper will not give good results when paired with the dye-based inks typically used in compatibles.


Source: PCplus

Tuesday, June 23, 2009

Farewell to KODACHROME Film



The name is famous and synonymous with cameras, and it has been around for 74 years. You know who we mean: yes, Kodak.

Sadly, the company has not kept pace with technology in this all-digital era. Few people shoot on rolls of film anymore, so KODACHROME color film will soon, before the year is out, become a memory. So says the official announcement from Kodak.

KODACHROME was a hit when it was first introduced in 1935, but demand has since fallen sharply. No surprise, given how many photographers have switched to newer film types or now shoot to memory cards rather than film. Today, sales of KODACHROME Film account for less than 1% of Kodak's total still-film sales.

As part of its tribute to KODACHROME Film, Kodak will donate its last rolls to the George Eastman House International Museum of Photography and Film in Rochester, which holds the world's largest collection of cameras and related artifacts. One of the last rolls will be shot by photographer Steve McCurry, and his images will be donated to Eastman House.

To commemorate the history of roll film, Kodak has created a gallery of iconic images, including the Afghan girl and other McCurry photos, along with work by other professional photographers such as Eric Meola and Peter Guttman. Visit the site at www.kodak.com/go/kodachrometribute.

Source: Kompas

Monday, June 22, 2009

Designed to Withstand Earthquakes, Built to Last 100 Years


The completion of the Suramadu Bridge marks a new milestone in transport-infrastructure construction in Indonesia. The 5,438-meter inter-island bridge, to be inaugurated tomorrow (June 10), is the longest not only in Indonesia but in all of Southeast Asia.

As a bridge connecting two islands, Suramadu (Surabaya-Madura) is actually the second of its kind, after the Barelang (Batam-Rempang-Galang) chain completed in 1997. Those six bridges of various types, linking seven small islands in Riau Islands Province, stand as a landmark of Indonesian engineers' success and self-reliance in building inter-island bridges.

Before Suramadu was built, doubts arose. Is it even possible to build a bridge in a fault and earthquake zone? And what of the notoriously strong winds over the Madura Strait: would they not affect the bridge's structure?

In-depth research was finally carried out over 2003-2004. This largely technical study covered 12 items, most of them soil parameters.

The seismic hazard analysis, for instance, concluded that there is no active fault near the bridge site. The earthquake catalog likewise recorded no quake above magnitude 4 on the Richter scale there, so conditions around the site are quite stable.

In-depth studies were also made of the seabed contours, the sea currents, and the effect of tides on the bridge. All proved favorable for a bridge linking the two islands. As for wind, the studies found crosswinds of about 3.6 kilometers per hour up to a maximum of 65 kilometers per hour.

Earthquake-resistant

The Suramadu Bridge, whose first pile was driven on August 20, 2003 by President Megawati Soekarnoputri, can now withstand earthquakes of up to magnitude 7 on the Richter scale. It is also designed with an anti-corrosion system on its steel pile foundations.

Because it links two islands, Suramadu was engineered so that ships can pass beneath it. That is why the main span provides a clearance 400 meters wide and about 35 meters high.

To give ships even more room to maneuver, two towers (pylons) rise 140 meters each above the water at the main span. They are held by 144 stay cables and founded on piles driven 100 to 105 meters deep.

"The total tower length is about 240 meters. This is something that has never been done before," said Hermanto Dardak, Director General of Highways at the Department of Public Works.

Built for 100 years

In all, building Suramadu consumed about 650,000 tons of concrete and roughly 50,000 tons of steel. Small wonder the public works office calls Suramadu a megaproject, with total costs reaching Rp 4.5 trillion. The bridge is designed to last 100 years, nearly matching the British standard of 120 years.

Sitting in open water, Suramadu is exposed to the strong winds that can arise at sea. To keep vehicles crossing it safe, the Department of Public Works will build a center to monitor weather conditions, wind in particular.

"Once wind speeds reach 11 meters per second, or about 40 kilometers per hour, the bridge must be closed to two-wheeled vehicles for riders' safety," said Minister of Public Works Djoko Kirmanto.

If the wind strengthens to 18 meters per second, or about 65 kilometers per hour, the lanes for four-wheeled vehicles will be closed as well. These measures are purely for the safety and comfort of drivers; the structure itself remains sound, as Suramadu is designed to stand firm in winds of more than 200 kilometers per hour.
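The two thresholds quoted by the minister amount to a simple rule. This hypothetical sketch encodes them; the function name and labels are illustrative, not part of any official system:

```python
def suramadu_closures(wind_speed_ms):
    """Return which vehicle classes are barred at a given wind speed (m/s),
    per the thresholds quoted above: 11 m/s (~40 km/h) closes the bridge
    to two-wheelers, and 18 m/s (~65 km/h) closes the four-wheel lanes too."""
    closed = []
    if wind_speed_ms >= 11:
        closed.append("two-wheelers")
    if wind_speed_ms >= 18:
        closed.append("four-wheelers")
    return closed

print(suramadu_closures(8))    # []
print(suramadu_closures(13))   # ['two-wheelers']
print(suramadu_closures(19))   # ['two-wheelers', 'four-wheelers']
```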

Beyond standing up to wind, the Suramadu Bridge is also designed to carry vehicles to the same axle-load standard as roads on land. Accordingly, Suramadu is expected to bear loads of about 10 tons per vehicle axle.

Five minutes is enough

After tomorrow's inauguration, the Suramadu Bridge is expected to carry 8,000 to 9,000 motorcycles and about 4,000 four-wheeled vehicles per day.

These figures are based on earlier counts of traffic crossing Ujung-Kamal by ferry: about 2.4 million motorcycles per year (62 percent) and 1.5 million four-wheeled vehicles per year (38 percent).

Besides being busy, the bridge will clearly be a great help to the public, since it cuts the Surabaya-Madura travel time. Where the ferry used to take about 30 minutes, crossing via Suramadu takes just five.

Briefly stalled

Along the way, construction of Suramadu ran into funding problems. Delayed disbursements stalled work on the 672-meter approach bridge on the Surabaya side in September 2008. The East Java provincial government eventually advanced Rp 50 billion through Bank Jatim until a US$68.9 million loan from the Exim Bank of China came through.

An imperfect feasibility study also threw off the cost estimates; the bridge piles, originally designed at 45 meters, ended up about 90 meters long. As a result, the initial contract estimate of Rp 4.2 trillion swelled to Rp 4.5 trillion.

Of the financing, 55 percent was borne by the government and the remaining 45 percent came as loans from China. Of the total Rp 4.5 trillion cost, about Rp 2.1 trillion is owed to China. It is hoped that the heavy thought and expense poured into Suramadu will spur economic activity in Indonesia, East Java in particular.

Source: Kompas