Monthly Archives: February 2011

What’s in HP’s Beats Audio, marketing aside?

If you are like me, you may be wondering what is actually in Beats Audio technology, which comes from HP in partnership with Beats by Dr Dre.

The technical information is not that easy to find, but a comment on this blog directed me to this video:

http://www.precentral.net/what-beats-audio


According to this, it comes down to four things:

1. Redesigned headphone jack with better insulation, hence less ground noise.


2. Discrete headphone amp to reduce crosstalk. This is also said to be “more powerful”, but since we do not know what it is more powerful than, I am not going to count that as technical information.

3. Isolated audio circuitry.

4. Software audio profiles which I think means some sort of equalizer.
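If the fourth item really is just equalization, the idea is easy to sketch. The following is purely my own illustration, a hypothetical "bass boost" profile that uses a one-pole low-pass filter to isolate and lift the low band; it has nothing to do with HP's actual implementation, and the 200 Hz cutoff and 6 dB gain are invented values.

```python
import math

def bass_boost(samples, fs=44100, cutoff=200.0, gain_db=6.0):
    """Lift everything below roughly `cutoff` Hz by `gain_db` decibels."""
    alpha = math.exp(-2 * math.pi * cutoff / fs)  # one-pole low-pass coefficient
    extra = 10 ** (gain_db / 20) - 1              # extra gain for the low band only
    out, low = [], 0.0
    for s in samples:
        low = alpha * low + (1 - alpha) * s       # track the low-frequency content
        out.append(s + extra * low)               # add the boosted low band back in
    return out

# A steady (DC-like) input settles at the full boosted level
boosted = bass_boost([0.1] * 2000)
print(f"steady-state output: {boosted[-1]:.4f}")  # about 0.1 x 10^(6/20) = 0.1995
```

A fast-alternating (high-frequency) input passes through almost unchanged, which is the point of a shelf-style profile: only the bass is lifted.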

These seem to me sensible features, though what I would really like to see is specifications showing the benefits versus other laptops of a comparable price.

There may be a bit more to Beats Audio in certain models. For example, the Envy 14 laptop described here has a “triple bass reflex subwoofer”.


though this user was not greatly impressed:

I ran some audio tone test sites and found out the built in laptop speakers do not generate any sound below 200 Hz. In the IDT audio drivers speaker config there is only configuration for a 2 speaker stereo system, no 2.1 speaker system (which includes subwoofer). I’m miffed, because HP’s advertising copy claims “HP Triple Bass Reflex Subwoofer amplifiers put out 12W total while supporting a full range of treble and bass frequencies.” Clearly I am not getting “full range” frequencies.

Still, what do you expect from a subwoofer built into a laptop?

Straining to hear: the benefits of SACD audio

A discussion on a music forum led me to this SACD, on which pianist George-Emmanuel Lazaridis plays the Grandes études de Paganini. It was recommended as a superb performance and a superb recording.


I bought it and have to agree. The music is beautiful and the recording astonishingly realistic. Close your eyes and you can almost see the piano hammers striking the strings.

Since this sounds so good, I took the opportunity to explore one of my interests: the audible benefits of SACD or other high-resolution audio formats versus the 16/44 resolution of CD.

I have set up a simple comparison test. While it is imperfect and would not pass scientific scrutiny, I report it for its anecdotal interest.

First I set my Denon SACD player to its best quality settings, without any bass management or other interference with the sound.

Then I wired the analog Front Left and Front Right outputs to one input on my amplifier, and the analog stereo output to an external analog-to-digital converter (ADC), set to 16/44. When the player is in SACD stereo mode, these two sets of analog outputs should carry the same signal.

The output from the ADC is then connected to a digital input on the amplifier.

Now I can use the amplifier remote to switch between pure SACD, and SACD via an additional conversion to and from 16/44 sound, which in theory could be encoded on a CD.
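The digital leg of this loop can be mimicked numerically. This sketch (Python, my own toy model rather than my actual hardware chain) quantizes a full-scale 1 kHz tone to 16 bits, which is essentially what the ADC does, and measures how far below the signal the resulting error sits: close to the textbook figure of about 98 dB.

```python
import math

FS = 44100   # CD sampling rate in Hz
BITS = 16

def quantize(x, bits=BITS):
    # Round a sample in [-1, 1] to the nearest 16-bit integer level
    scale = 2 ** (bits - 1) - 1
    return round(x * scale) / scale

# One second of a full-scale 1 kHz sine: the "pure" side of the comparison
tone = [math.sin(2 * math.pi * 1000 * n / FS) for n in range(FS)]
cd_leg = [quantize(s) for s in tone]   # the 16-bit round-trip side

signal = sum(s * s for s in tone) / FS
noise = sum((s - q) ** 2 for s, q in zip(tone, cd_leg)) / FS
snr_db = 10 * math.log10(signal / noise)
print(f"quantization SNR: {snr_db:.1f} dB")  # theory predicts about 98 dB
```

With the error nearly 100 dB below the music, it is no surprise that differences, if any, are hard to hear.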

At first I could just about tell which was which. The SACD sounded a little more open, with more depth to the sound. It was more involving. I could not describe it as a huge difference, but perhaps one that would be hard to do without once you had heard it. A win for SACD?

Then I realised that the output on the ADC was slightly too low; the SACD was slightly louder. I increased the volume slightly.

Having matched the volume more exactly, I could no longer tell the difference. Both sounded equally good.

I enlisted some volunteers with younger and sharper hearing than mine, but without positive results.

I am not going to claim that nobody could tell the difference. I also recognise that a better SACD player, or a better audio system, might reveal differences that my system disguises.

Still, the test is evidence that on a working system of reasonable quality, the difference is subtle at most. Which is also what science would predict.

The SACD still sounds wonderful of course, and has a surround sound option which a CD cannot deliver. I also believe that SACDs tend to be engineered with more attention to the demands of high-end audio systems than CDs, which are tailored for the mass market.

Against that, CDs are more convenient because you can rip them to a music server. Personally I rarely play an actual CD these days.

Don’t be fooled. 24-bit will not fix computer audio

Record producer Jimmy Iovine, now chairman of Interscope and CEO of Beats by Dr Dre, says there are huge quality problems in the music industry. I listened to his talk during HP’s launch event for its TouchPad tablet and new smartphones.

“We’re trying to fix the degradation of music that the digital revolution has caused,” says Iovine. “Quality is being destroyed on a massive scale”.

So what has gone wrong? Iovine’s speech is short on technical detail, but he identifies several issues. First, he implies that 24-bit digital audio is necessary for good sound:

We record our music in 24-bit. The record industry downgrades that to 16-bit. Why? I don’t know. It’s not because they’re geniuses.

Second, he says that “the PC has become the de facto home stereo for young people” but that sound is an afterthought for most computer manufacturers. “No-one cares about sound”.

Finally, he says that HP, working with (no surprise) his own company Beats by Dr Dre, has fixed the problem:

We have a million laptops with Beats audio in with HP … HP’s laptops, the Envy and the Pavilion, actually feel the way the music feels in the studio. I can tell you, that is the only PC in the world that can do that.

Beats Audio is in the TouchPad as well, hence Iovine’s appearance. “The TouchPad is a musical instrument” says Iovine.

I am a music and audio enthusiast and part of me wants to agree with Iovine. Part of me though finds the whole speech disgraceful.

Let’s start with the positive. It is true that the digital revolution has had mixed results for audio quality in the home. In general, convenience has won out over sound quality, and iPod docks are the new home stereo, compromised by little loudspeakers in plastic cabinets, usually with lossy-compressed audio files as the source.

Why then is Iovine’s speech disgraceful? Simply because it is disconnected from technical reality for no other reason than to market his product.

Iovine says he does not know why 24-bit files are downgraded to 16-bit. That is implausible. The first reason is historical. 16-bit audio was chosen for the CD format back in the eighties. The second reason is that there is an advantage in reducing the size of audio data, whether that is to fit more on a CD, or to reduce download time, bandwidth and storage on a PC or portable player.

But how much is the sound degraded when converted from 24-bit to 16-bit? PCM audio has a sampling rate as well as a bit-depth. CD or Redbook quality is 16-bit sampled at 44,100 Hz, usually abbreviated to 16/44. High resolution audio is usually 24/96 or even 24/192.

The question then: what are the limitations of 16/44 audio? We can be precise about this. Nyquist’s theorem says that a 44,100 Hz sampling rate is enough to perfectly reconstruct a band-limited audio signal whose highest frequency is below 22,050 Hz, half the sampling rate. Human hearing may extend to 20,000 Hz in ideal conditions, but few can hear much above 18,000 Hz, and the limit diminishes with age.

Redbook audio also limits the dynamic range (the difference between the quietest and loudest passages) to about 96 dB.
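Both limits follow directly from the format’s numbers; a quick check of the arithmetic:

```python
import math

SAMPLE_RATE = 44100  # Hz, Redbook CD
BIT_DEPTH = 16

# Highest frequency the format can represent (half the sampling rate)
nyquist = SAMPLE_RATE / 2

# Ratio of full scale to the smallest step, expressed in decibels
dynamic_range_db = 20 * math.log10(2 ** BIT_DEPTH)

print(nyquist)                     # 22050.0
print(round(dynamic_range_db, 1))  # 96.3
```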

In theory then, 16/44 should be good enough to cover the limits of human hearing. Still, there are other factors which mean that what is achieved falls short of what is theoretically possible. Higher-resolution formats might therefore sound better. But do they? See here for a previous article on the subject; I have also done a more recent test of my own. It is difficult to be definitive, but my view is that in ideal conditions the difference is subtle at best.

Now think of a PC or Tablet computer. The conditions are far from ideal. There is no room for a powerful amplifier, and any built-in speakers are tiny. Headphones partly solve this problem for personal listening, even more so when they are powered headphones such as the high-end ones marketed by Beats, but that has nothing to do with what is in the PC or tablet.

I am sure it is true that sound quality is a low priority for most laptop or PC vendors, but one reason is that digital audio converter technology is mature: even the cheap audio chipsets built into mass-market motherboards are unlikely to be the weak link in most computer audio setups.

The speakers built into a portable computer are most likely a bit hopeless – and it may well be that HP’s are better than most – but that is easily overcome by plugging in powered speakers, or using an external digital-to-analog converter (DAC). Some of these use USB connections so that you can use them with any USB-equipped device.

Nevertheless, Iovine is correct that the industry has degraded audio. The reason is not 24-bit vs 16-bit, but poor sound engineering, especially the reduced dynamic range inflicted on us by the loudness wars.
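To illustrate what the loudness wars do to a waveform, here is a toy example of my own, not any particular mastering chain: drive the gain up and hard-clip the peaks. The average level jumps, but the tops of the waveform are sheared off, and that flattening is audible as distortion.

```python
import math

def loudness_war(samples, drive=8.0):
    # Crude "loudness" mastering: heavy gain, then hard clipping at full scale
    return [max(-1.0, min(1.0, s * drive)) for s in samples]

def rms(samples):
    # Average level of the signal (what "loudness" roughly tracks)
    return math.sqrt(sum(s * s for s in samples) / len(samples))

tone = [0.25 * math.sin(2 * math.pi * n / 100) for n in range(1000)]
loud = loudness_war(tone)

print(f"RMS before: {rms(tone):.3f}, after: {rms(loud):.3f}")  # much louder...
print(f"peak after: {max(loud):.3f}")                          # ...pinned flat at 1.0
```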

The culprits: not the PC manufacturers as Iovine claims, but rather the record industry. Note that Iovine is chairman of a record company.

It breaks my heart to hear the obvious distortion in the loud passages during a magnificent performance such as Johnny Cash’s version of Trent Reznor’s Hurt. That is an engineering failure.

Tiny data projectors using Texas Instruments DLP chips

Remember when data projectors were huge and expensive, and had bulbs so delicate that you were not meant to move them for half an hour after switch off?


Things are different now. At Mobile World Congress you can hold an HD projector in the palm of your hand, or build one into a mobile phone. The projectors I saw were based on DLP Pico chipsets from Texas Instruments, which contain up to two million hinge-mounted microscopic mirrors. Add a light source and a projection lens, and you get a tiny projector.

The obvious use case is that you can turn up at an ad-hoc meeting and show photos, charts or slides on the nearest wall, instead of huddling round a laptop screen or setting up an old-style data projector.

Amazon Kindle goes social with Public Notes, Twitter and Facebook integration

A free firmware update for Amazon’s Kindle ebook reader adds several new features, including an element of social networking.

The features are as follows:

  • Page numbers for easier referencing, for example in essays, reviews and discussions. Page numbers must be included in the digital book for this to work. It is not clear how many titles include them; Amazon just says “Many titles in the Kindle Store now include real page numbers”.
  • New newspaper and magazine layout with a “Sections & Articles” view. Each section has its own article list for easier browsing.


  • Public notes with Facebook and Twitter integration. This is the feature that makes Kindle reading social. You can attach notes to a passage and make them publicly viewable by other readers who choose to follow you, either on a note-by-note basis, or by making an entire book public through the Amazon website. You can also register a Facebook and Twitter account and have specific notes and ratings posted to those who follow you on those networks.


The advantage for Amazon is that these features should promote books through viral marketing.

It comes at an interesting time, since Apple’s new subscription rules may make it difficult for Amazon to continue supporting iPhone and iPad with free readers. Apple is insisting on a 30% cut of the revenue for all titles purchased through apps, forming a financial barrier for competitors to its own iBooks service.
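A quick worked example shows why the 30% cut is such a barrier. The numbers are illustrative assumptions of mine (a $9.99 ebook and a 70% publisher share, typical of agency-model pricing), not Amazon’s actual figures:

```python
price = 999                            # illustrative ebook price in cents ($9.99)
apple_cut = round(0.30 * price)        # Apple's 30% of an in-app sale
publisher_share = round(0.70 * price)  # assumed agency-model publisher share
amazon_margin = price - apple_cut - publisher_share

print(f"Apple takes {apple_cut}c, publisher {publisher_share}c, "
      f"leaving Amazon {amazon_margin}c on a {price}c sale")
```

Under those assumptions there is nothing left for the retailer, which is why a free reader app funded by store sales becomes hard to sustain.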

If Amazon can cement loyalty to Kindle through social network integration, that could help it maintain market share.

 

My question to the Gorilla Glass folk: when is Apple going to call?

I had a brief chat with Corning, makers of Gorilla Glass, who were showing their wares at Mobile World Congress in Barcelona.


Gorilla Glass is exceptionally strong and makes sense for expensive devices with glass screens – like many of the smartphones and tablets that are hot right now.

According to Corning, it is strengthened through an ion-exchange process:

Ion exchange is a chemical strengthening process where large ions are “stuffed” into the glass surface, creating a state of compression. Gorilla Glass is specially designed to maximize this behavior.

The glass is placed in a hot bath of molten salt at a temperature of approximately 400°C. Smaller sodium ions leave the glass, and larger potassium ions from the salt bath replace them. These larger ions take up more room and are pressed together when the glass cools, producing a layer of compressive stress on the surface of the glass. Gorilla Glass’s special composition enables the potassium ions to diffuse far into the surface, creating high compressive stress deep into the glass. This layer of compression creates a surface that is more resistant to damage from everyday use.

Fair enough; and I am a fan because it works. My question though: when is Apple going to call? The iPhone 4 has glass panels both front and rear, and is unfortunately rather fragile. I would be interested to know what proportion of damaged iPhones simply have shattered glass. One drop onto a hard surface is all it takes, unless you have a protective case that adds bulk and in my view spoils the design.

Apple’s Bumper case fixes the problem with the antenna design, but does little to protect the glass.

The iPad glass is also prone to shatter if dropped:

It slipped off my lap in a bar in the Portland airport during a particularly long layover. It landed screen-side down on the uneven Mexican tile floor and made a sound that caused the whole room to go quiet. I still feel a little sick just remembering it. It looked a lot like the ones people purposely ran over with trucks when I picked it up.

From what I can tell, Gorilla Glass really is better in this respect.

So how about it Apple?

First look at HP’s TouchPad WebOS tablet

I took a close look at HP’s WebOS TouchPad tablet during Mobile World Congress in Barcelona.

This 9.7” machine looks delightful. One of its features is wireless charging using the optional Touchstone accessory. The same technology can also transmit data, as mentioned in this post on wireless charging, and the TouchPad makes use of this in conjunction with new WebOS smartphones such as the Pre3 and the Veer. Put one of these devices next to a TouchPad and the smartphone automatically navigates to the same URL that is displayed on the TouchPad. A gimmick, but a clever one.


From what I saw though, these WebOS devices are fast and smooth, with strong multitasking and a pleasant user interface. Wireless charging is excellent, and a feature you would expect Apple to adopt before long since it reduces clutter.

I still would not bet on HP winning big market share with WebOS. The original Palm Pre was released to rave reviews but disappointing sales, and HP will have to work a miracle to avoid the same fate.

Wireless power at Mobile World Congress: no more chargers?

At Mobile World Congress Fulton Innovation was showing off its wireless power technology called eCoupled. We are accustomed to the idea of transmitting data wirelessly, but less familiar with wireless power. It is possible though, and I saw several examples. One of the most striking but least useful is this cereal box, printed with conductive ink, which lights up when placed on a special shelf – the inset image shows the same packet before the title lit up.


The technology has plenty of potential though. I travelled to Barcelona with a case full of chargers, and the idea of simply placing each device on a charging shelf instead is compelling; this is already possible and I saw several examples. The Wireless Power Consortium has created a wireless power standard called Qi:

It will be no surprise to see Qi stations in the office, hotels, airports, railway stations as part of the normal infrastructure that offers wireless power charging service


On the eCoupled site you can see some other ideas, like kitchen appliances that work simply by being placed on a powered surface:

eCoupled will one day integrate into the walls and surfaces of your home. If you’re watching the big game, the TV won’t need to be plugged in. Power will be delivered wirelessly via the eCoupled-enabled wall. In the kitchen, a multipurpose countertop will allow you to mix, chop, blend and boil all on the same powered surface. There will be no cords to plug in, or outlets to worry about.

The technology allows data transmission as well, so the glowing cereal box can also report when it has passed its sell-by date. Now that might actually serve a purpose.

Farewell to Bizarre Creations, makers of Project Gotham Racing and Geometry Wars

It is hard to understand why some of the best game studios go out of business, while lesser ones (I am not going to mention names) continue. The last time I felt like this was when Ensemble Studios, makers of Age of Empires, was closed by Microsoft. Today it is the turn of Activision’s Bizarre Creations, based in Liverpool.

There are countless racing games, but when I encountered Project Gotham Racing on the original Xbox I knew it was special. It is a hard game which rewards skill; you will not get far if you simply try to charge round the track. It is also street racing, with superb graphics capturing well-known locations like London and San Francisco. The graphics got better in the later versions of the game, but for gameplay I still have an affection for the first one.


The other I will mention is Geometry Wars, which started as a mini-game in Project Gotham Racing 2, but came into its own as an arcade game for the Xbox 360. This one cannot be captured in a screenshot: it is where gaming meets art, creating fantastic visions of light and colour as you charge round the screen trying desperately to stay alive.


Apparently the commercial failure of the critically acclaimed Blur, released in 2010, was the beginning of the end for Bizarre Creations.

Laments and memories can be found in the official forum here and also on neoGAF. Farewell video here.

Thank you to Bizarre Creations for some of the best gaming moments of my life.

Motorola Atrix – the future of the laptop?

I took a closer look at the Motorola Atrix on display here at Mobile World Congress. This is a smartphone built on NVidia’s Tegra 2 dual-core chipset. I’m interested in the concept as much as the device. Instead of carrying a laptop and a smartphone, you use the smartphone alone when out and about, or dock to a laptop-like screen and keyboard when at a desk. The dock has its own 36Wh battery so you are not tied to mains power.


The Atrix has a few extra tricks as well. HDMI out enables HD video. An audio dock converts it to a decent portable music player.


The smartphone also morphs into a controller if you use Motorola’s alternative dock, designed for use with a fully external keyboard and screen.


It is a compelling concept, though there is a little awkwardness in the way Motorola has implemented it. The Atrix has two graphical shells installed. One is Android. The other is an alternative Linux shell which Motorola calls Webtop. While you can freely download apps to Android, Webtop has just a few applications pre-installed by Motorola, with no official way to add more. One of them is Firefox, so you can browse the web using a full-size browser.

The disconnect between Android and Webtop is mitigated by the ability to run Android within Webtop, either in its own smartphone-sized window, or full screen.

Personally I prefer the idea of running Android full screen, even though it is not designed for a laptop-sized screen, as I do not like the idea of having two separate sets of apps. That seems to miss the point of having a single device. On the other hand, Webtop does enable non-Android apps to run on Atrix, so I can see the value it adds.

Leaving that aside, I do think this is a great idea and one that I expect to become important. After all, if you do not think Tegra 2 is quite powerful enough, you could wait for some future version built on the quad-core Tegra 3 (name not yet confirmed), which NVidia says is five times faster, and which may turn up in smartphones late in 2011.