Friday, March 29, 2013

Love and the Singularity

One of the milestones of the singularity is a computer capable of passing the Turing test: can you tell the difference between a computer and a real human using only text messages? It manages to be both subjective and definitive in that you cannot continue to maintain that you prefer humans to machines if you can no longer tell them apart, much like the Pepsi Challenge.

In the context of the singularity, the Turing test is a canary in the coalmine. It is not practical as such, although I've always felt it would make an amusing stage in the Miss World contest. Instead it signifies that computers have stormed the last fortress of human uniqueness (interestingly, a synonym of singularity), and that they are now likely capable of our most precious gift: original thought.

But where is love? That most powerful part of the human experience and the driver for so much of what we do (both good and bad). A computer may become powerful and creative enough to cure malaria, but what is its motivation to do so? A human scientist does so out of a mixture of love for their fellow humans, love of the terrible beauty of biology, and love of doing something they are good at. I expect all of these things are an important part of the creative process, possibly even essential, but I am not suggesting the capacity for love is somehow uniquely human and that this will prevent the singularity - it's not that kind of essay - I'm simply pointing out that truly intelligent computers may require the capability for love. Personally I find it harder to imagine a computer writing an ode than one beating me at chess; a feat already achieved by my treacherous cell phone.

If computers could love humans, or even if they couldn't, could we love them? This has significant consequences and shares some of the traits of the Turing test: love is the ultimate subjective emotion, but loving a computer would be a definitive step and a harbinger of societal change. It is hard to feel that computers are soulless automatons bent on destroying humanity if you are in love with one.

There are many types of love of course. Some of us already love our computers as we might love a treasured slide rule or sports car. It is not hard to imagine loving a computer as we love a pet: a condescending love of unequals. It is conceivable that we could love one as we love a colleague possessed of great insight and ability. But could we love a computer as we love a friend, a child, or a partner? If a computer were both lovable and orders of magnitude superior to us, would we not love them as gods? Each of these represents a different level of acceptance of the singularity, and an increasing level of disruption.

It is argued that a great disparity between human and machine will not arise because humans will be augmenting their own capabilities to match those of the artificial intelligences, and for the individual this may be viable, but no man is an island. For instance, a good marriage depends on growing together, but if your wife suddenly became a million times smarter than you, it's a good bet you would struggle with pillow talk. So do you only "upgrade" together? What about your children and your friends? These are complicated questions that could cause real pain during the constant revolution the singularity promises, but without love at the heart of it, it would truly be post-human.

If I speak in the tongues of men or of angels, but do not have love, I am only a resounding gong or a clanging cymbal. If I have the gift of prophecy and can fathom all mysteries and all knowledge, and if I have a faith that can move mountains, but do not have love, I am nothing. If I give all I possess to the poor and give over my body to hardship that I may boast, but do not have love, I gain nothing.

Friday, March 23, 2012

Android Emulation on x86 Machines

It was news to me that Android doesn't just work on ARM processors, but apparently it has been running on x86 machines for a while.

Now thanks to some wizardry from Intel, the performance has been significantly improved by taking advantage of virtualization. Although x86-based Android devices aren't going to take over the world anytime soon, this does mean that we now have a viable Android emulator to use on the desktop.


The instructions are here and I suggest you follow them all. The only sticking point is whether your machine supports Virtualization Technology. You can find that out by checking Intel's list or running their processor information tool. Even my budget Acer i3 laptop has it:

Then you need to run the Android SDK Manager and download version 17 of Android SDK Tools and the x86 system image:

One limitation is that the only AVD image available at the moment is for Android 2.3.3, but (depressingly perhaps) that covers 95% of the devices out there. There is also an Intel configuration tool (in the Extras section) that you must install and run (see the aforementioned instructions). Note that the virtualization manager is memory hungry (I recommend a 2GB allocation), but RAM is so cheap that maxing out my laptop to 8GB only cost about £30 (it's the best investment you can make).

When you next open Android AVD Manager, you should be able to create a new x86-based VM:

Configure to your heart's content and launch...

You should find it a much faster experience than the ARM emulator and identical for Java-based applications. I'm not sure if NDK applications would work the same - comments welcome!

The possibility of developing applications completely on one machine really lowers the barrier to entry.

Tuesday, November 29, 2011

Android Permissions - Protection Levels

Android applications declare the permissions they are likely to require in their manifest (a short file that describes the contents of the 'package'). This allows the system to sandbox them from critical resources and gives the user some indication of what havoc they might wreak. That's the theory at least, but the first time I installed an application and read the permissions page I had no idea what they were on about! Clearly this system needs to be changed, but that is not what I want to talk about today.

As an application writer I need to know the protection level of these permissions, i.e. which of them are:

normal - can cause the user no real harm
dangerous - might require a greater level of trust, such as the ability to read SMS messages
signature - only granted to applications that are signed by the people who built the OS
signatureOrSystem - like signature, but also granted if the application has been pre-installed in a system folder

I was surprised to find no easy reference for this in the documentation, but I did find the relevant information in the source.

You can of course probe the android package itself for this information, which is useful if you don't have access to the particular version of Android you are running. Here is some code that does just that:

// Get the permissions defined by the core "android" package
PackageInfo packageInfo = getPackageManager().getPackageInfo("android", PackageManager.GET_PERMISSIONS);
if (packageInfo.permissions != null) {
  // For each defined permission
  for (PermissionInfo permission : packageInfo.permissions) {
    // Map the protection level to its manifest name
    String protectionLevel;
    switch (permission.protectionLevel) {
      case PermissionInfo.PROTECTION_NORMAL: protectionLevel = "normal"; break;
      case PermissionInfo.PROTECTION_DANGEROUS: protectionLevel = "dangerous"; break;
      case PermissionInfo.PROTECTION_SIGNATURE: protectionLevel = "signature"; break;
      case PermissionInfo.PROTECTION_SIGNATURE_OR_SYSTEM: protectionLevel = "signatureOrSystem"; break;
      default: protectionLevel = "<unknown>"; break;
    }
    // Dump permission info
    Log.i("PermissionCheck", permission.name + " " + protectionLevel);
  }
}

...and here are the results in case you need to know them at a glance...

[Table: Permission | Protection Level]

Thursday, June 30, 2011

Cloud Production

I've always been fascinated by 3D printers and recently ordered one from the crowd-funded Huxley project.

This project has raised far more than the initial expectation, and has forced the company that runs it to increase its production capabilities by an order of magnitude. Now because these printers can print parts for themselves, eMaker are reaching out to other 3D printer owners to help them cope with the demand.

Imagine a future where you order something online and rather than coming from a local warehouse it is manufactured in a local facility that can make anything. As long as the quality is acceptable you don't have to know or care where it came from. The benefits of this from a production and supply chain point of view are enormous and efficiencies in the supply chain would mean cheaper goods for consumers. Couple this with the environmental benefits (less transportation and waste) and you have a game-changing technology: Cloud Production.

Friday, May 27, 2011

Passpack - Online password management

Did I bore you about Passpack yet? If not read on...

Passpack is a website that manages your passwords and other login details. It is simple to use and allows you to share passwords with colleagues and family members.

It has some nice touches too: if you add a link to the login page for a site, it can auto-complete the login fields (using a bookmarklet).

I can also recommend the automatic password generation, which helps you avoid the principal danger of password reuse.

It has a neat system whereby the passwords are decrypted locally in the browser using your security pass-phrase. This means that even the Passpack folks can't see your passwords. Of course this means you should keep your pass-phrase written down somewhere - I recommend keeping it with your will :-)
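This "host-proof" pattern - derive a key from the pass-phrase on the client and only ever send ciphertext to the server - is not specific to Passpack. Here is a minimal Java sketch of the idea (not Passpack's actual implementation, which runs as JavaScript in the browser; the key-stretching parameters here are illustrative):

```java
import javax.crypto.Cipher;
import javax.crypto.SecretKeyFactory;
import javax.crypto.spec.IvParameterSpec;
import javax.crypto.spec.PBEKeySpec;
import javax.crypto.spec.SecretKeySpec;
import java.nio.charset.StandardCharsets;
import java.security.SecureRandom;

public class HostProof {
    // Derive an AES key from the user's pass-phrase; the key never leaves the client
    static SecretKeySpec deriveKey(char[] passPhrase, byte[] salt) throws Exception {
        SecretKeyFactory f = SecretKeyFactory.getInstance("PBKDF2WithHmacSHA1");
        byte[] keyBytes = f.generateSecret(new PBEKeySpec(passPhrase, salt, 10000, 128)).getEncoded();
        return new SecretKeySpec(keyBytes, "AES");
    }

    // Encrypt locally; only this ciphertext is stored on the server
    static byte[] encrypt(String plaintext, SecretKeySpec key, byte[] iv) throws Exception {
        Cipher c = Cipher.getInstance("AES/CBC/PKCS5Padding");
        c.init(Cipher.ENCRYPT_MODE, key, new IvParameterSpec(iv));
        return c.doFinal(plaintext.getBytes(StandardCharsets.UTF_8));
    }

    // Decrypt locally with the re-derived key
    static String decrypt(byte[] ciphertext, SecretKeySpec key, byte[] iv) throws Exception {
        Cipher c = Cipher.getInstance("AES/CBC/PKCS5Padding");
        c.init(Cipher.DECRYPT_MODE, key, new IvParameterSpec(iv));
        return new String(c.doFinal(ciphertext), StandardCharsets.UTF_8);
    }

    public static void main(String[] args) throws Exception {
        byte[] salt = new byte[16], iv = new byte[16];
        new SecureRandom().nextBytes(salt);
        new SecureRandom().nextBytes(iv);
        SecretKeySpec key = deriveKey("correct horse battery staple".toCharArray(), salt);
        byte[] stored = encrypt("hunter2", key, iv); // this is all the server ever sees
        System.out.println(decrypt(stored, key, iv)); // prints hunter2
    }
}
```

The point is that the server holds salt, IV, and ciphertext, but without the pass-phrase none of it is readable - which is also why a lost pass-phrase means lost data.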

For extra simplicity, you can login with your ID from Google, Facebook, Twitter, or an OpenID provider. This doesn't help with the pass-phrase, but it stops you needing two passwords: one to login and one to decrypt your data.

Until true SSO is a reality, this makes identity management much simpler.

Thursday, May 05, 2011

ShareSafe.TV


I often want to share YouTube videos with my kids, but they are surrounded by links to other videos, which can often be unsuitable. ShareSafe.TV displays only the video you want to show and nothing else. Use their link generator or just add v/<video id> to the end of their URL.
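If you generate these links yourself, it is just string concatenation. A tiny Java sketch (assuming the base URL is http://sharesafe.tv/ - check the site's link generator for the exact form):

```java
public class ShareSafe {
    // Build a ShareSafe.TV link by appending v/<video id> to the base URL
    static String shareSafeUrl(String videoId) {
        return "http://sharesafe.tv/v/" + videoId;
    }

    public static void main(String[] args) {
        // "dQw4w9WgXcQ" is just an example YouTube video id
        System.out.println(shareSafeUrl("dQw4w9WgXcQ")); // prints http://sharesafe.tv/v/dQw4w9WgXcQ
    }
}
```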

Here is an example:

Wednesday, April 20, 2011

MSTest and 64bit

This post is about running MSTest for applications that target mixed platforms.

If you are lucky enough to be able to write your applications in pure .NET, then you may never encounter 32bit/64bit platform issues. However, if any dependent library or plug-in is compiled for a specific architecture, then your whole application must be run in that mode. This is why the default Windows Internet Explorer is still 32bit despite a 64bit version shipping since Vista: it has to be the same architecture as any legacy plug-ins. By contrast, Notepad doesn't have any plug-ins, so it can get away with being 64bit only.

My company's applications rely on many native libraries, which are obviously compiled for specific architectures (x86 and x64). Deploying an application for multiple target processors is a complex subject in itself that can be solved with a range of strategies from dynamic library linking to processor-specific installers, but however you deploy, your application will behave differently in these two different modes so they must both be tested.

For better or for worse, we use MSTest to control application quality. Since the release of Visual Studio 2010 this has been able to run in 64-bit mode as well as 32-bit mode, but there are certain subtleties that complicate the practical aspects of administering your tests.

To understand the problem, consider the way MSTest works. Testing is done using two programs: MSTest.exe and QTAgent32.exe. MSTest is told what assemblies to load and it scans those assemblies (using reflection) to find any classes and methods annotated as tests using the various test attributes. To do this it must be able to load the assembly and all its dependent assemblies, and because MSTest is a 32bit process, none of these assemblies can be exclusively 64bit. Once loaded, MSTest instructs QTAgent32 to run the tests, which means QTAgent32 must load the assemblies itself and execute the test methods; because it too is a 32bit process, it cannot load 64bit assemblies either.
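The discovery step is essentially annotation scanning via reflection, and the key constraint is that a binary must load successfully before it can be scanned. As an illustration (in Java rather than .NET, with a made-up @TestMethod annotation standing in for MSTest's [TestMethod] attribute):

```java
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.reflect.Method;
import java.util.ArrayList;
import java.util.List;

public class Discovery {
    @Retention(RetentionPolicy.RUNTIME)
    @interface TestMethod {} // stand-in for MSTest's [TestMethod] attribute

    static class SampleTests {
        @TestMethod public void addsNumbers() {}
        public void helper() {} // not annotated, so not a test
    }

    // Reflection-based discovery: the class (assembly) must already be loaded
    // before it can be scanned, which is why a 32bit scanner can never
    // discover tests in 64bit-only binaries.
    static List<String> discoverTests(Class<?> cls) {
        List<String> tests = new ArrayList<>();
        for (Method m : cls.getMethods()) {
            if (m.isAnnotationPresent(TestMethod.class)) tests.add(m.getName());
        }
        return tests;
    }

    public static void main(String[] args) {
        System.out.println(discoverTests(SampleTests.class)); // prints [addsNumbers]
    }
}
```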

In Visual Studio 2010 a new version of QTAgent32 was added called QTAgent.exe, which can run 64bit assemblies. This means that even though MSTest is still 32bit, QTAgent can execute in full 64bit mode so that pure .NET assemblies can now be tested in 32bit and 64bit mode. However, it still doesn't easily allow applications with mixed-mode assemblies to be tested in 64bit mode because they cannot be loaded by MSTest in the first place.

One interesting solution to this is to force MSTest.exe to run as a 64bit application. This is possible because MSTest is actually pure .NET code that has merely been flagged to run in x86 mode. If you are going down this road, note that MSTest relies on various registry entries (HKLM\SOFTWARE\Microsoft\VisualStudio\10.0\EnterpriseTools\QualityTools\TestType and HKLM\SOFTWARE\Microsoft\VisualStudio\10.0\Licenses) to decide which extensions it can handle and what features are licensed for use, and that these are installed by default to the WOW6432Node registry "shadow" branch. To run in 64bit mode you must copy some of the registry entries over as well as editing the binaries themselves.

There is an alternative approach that doesn't involve editing executable files and local machine registry settings (which can be a pain across a large development team). Our application builds for two distinct platform targets, x86 and x64 (note however that most assemblies are compiled as "Any CPU", except the ones that contain native code), and test projects are only built in the x86 solution configuration. This ensures that the code in their bin folder is 32bit compatible, and they can therefore be loaded into MSTest. Also, the tests are configured to run against the binaries in the actual application deployment folder rather than in their own bin folder, using the new root folder feature of Visual Studio 2010:

In this case we have used an environment variable that signifies where to find the deployed binaries. Whether you do this or not, it seems to always want a full path for one reason or another, which can make supporting multiple development environments a challenge.

We make one of these test configurations for each target platform, remembering to change the Hosts section that controls 32bit and 64bit execution. Then we can run both configurations from the command line like this:

mstest /testsettings:WorkStation32.testrunconfig /testmetadata:SOLUTION.vsmdi
mstest /testsettings:WorkStation64.testrunconfig /testmetadata:SOLUTION.vsmdi

The trick here is that in both cases MSTest will load the 32bit binaries to decide what tests to run, but the different configuration files will control if QTAgent32 or QTAgent is used. Note that this cannot work with the /noisolation switch, because MSTest cannot host the 64bit binaries.

The disadvantage of running your unit tests against the deployment folder is that your tests are less "clean" and test failures could take longer to diagnose. The advantage is that the tests are being run on code as it will appear in the wild, which can include complex deployment features such as assembly obfuscation.

This system will work on development desktops and build servers. It may give the Visual Studio IDE pause for thought occasionally, but it is fundamentally compatible, which is one of the only real advantages of MSTest in the first place.

Friday, February 18, 2011

Wednesday, May 12, 2010

Microsoft's Click-to-Run and Office Automation

Today's lesson: When Office is installed using Click-to-Run, it doesn't support automation.

We use Excel automation via C# in our application, and when testing against the new versions of Office we hit a bump in the road. Office Home and Business 2010 typically installs via Click-to-Run, which is designed to have a small footprint, and as such does not register itself for programmatic automation.

So when you see this error: "Retrieving the COM class factory for component with CLSID {00024500-0000-0000-C000-000000000046} failed due to the following error: 80040154" even though Excel is apparently installed, you probably have this issue.

More here about Excel automation and the expected registry keys.

It is suggested that there will be an alternative MSI-based installer that presumably will not have this problem.

Wednesday, February 24, 2010

Vertical Alignment in CSS

I know I should use CSS, but sometimes I fall off the wagon and use tables instead. This usually happens when I want to vertically align content. Here are some very clear hints and tips about vertical alignment that may help me kick the habit: