Eternity II

I’ve been interested in this puzzle for a while. It’s a 16×16 jigsaw puzzle, with each piece split diagonally into 4 quadrants carrying different designs. You have to fill the jigsaw puzzle mat with the 256 pieces such that every pair of touching edges carries the same design. Here’s a picture to give you an idea:

Yes, I did go out and buy the puzzle. It looks far more formidable in the flesh (or rather, in the cardboard).

A friend and I discussed the puzzle at length and he’s been working on a simple solver, just for fun. There are already a couple of solvers out there for this. I’ve got the edge sets worked out, and it’s fun to play around with some ideas for algorithms and solvers. The puzzle is an edge-matching problem, which is NP-complete, so there is no ‘easy’ way of solving it. Interestingly, it falls into the same complexity class as packing puzzles, which was the idea behind Eternity I (which I also own, but did not like as much, as it was really easy to knock all the pieces out of position). In fact, the pair who solved Eternity I helped design this second incarnation. Here’s a page explaining how they solved the first version.

The Wikipedia page says that the creators designed the puzzle to have a low number of exact solutions, to try and make it even harder (how rude). I read somewhere that the number of exact solutions is estimated to be ~20,000. Not a lot considering that the number of ways of placing the pieces is ~10^600.
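As a back-of-envelope check on that figure (a sketch only; the quoted ~10^600 presumably already folds in some of the board’s constraints), the naive count of 256 pieces in 256 cells with 4 rotations each works out to:

```python
# Back-of-envelope: 256 pieces into 256 cells (256!) with 4 rotations
# each (4^256), computed in log space to avoid huge integers.
import math

log10_placements = (math.lgamma(257) + 256 * math.log(4)) / math.log(10)
print(f"naive placements ~ 10^{log10_placements:.0f}")  # ~ 10^661
```

So the naive count is even larger than the quoted figure, which only makes the ~20,000 solutions look lonelier.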

So how do you go about solving an intractable problem?

Here’s a guy who’s reviewed 2 solvers, one C++ based and one Java based (although the Java one is a port of the C++ one). There’s also a SourceForge project dedicated to this. There are also a couple of distributed computing efforts to try and solve the puzzle (essentially farming out tasks to a large network of computers, similar to the SETI@home project). Although one might ask how the prize would be fairly split in the case of such a syndicate win, especially if the participants contribute different fractions of the total computational time.

For a more technical insight into this class of puzzles, I suggest reading this paper.

I was wondering what kind of heuristics could be applied to this situation. The solvers I have seen so far all seem to be brute-force based. Surely a simulated annealing approach, combined with some kind of random swap / GA, would do better? If you could get a large proportion of the pieces fitted, say 80–90%, using this approach, then you could switch to a different method for the last part (even brute force). Knowing how long to anneal the system for is the tricky part. It would be interesting to define a fitness function as, say, the number of satisfied edges, and then plot this value as a function of time to monitor the convergence rate. You could quickly get an idea of how interrupting the annealing schedule with, say, different rates of random swaps (mutations) influences the convergence rate.
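To make the fitness-function idea concrete, here’s a toy sketch (not a real Eternity II solver; a small n×n instance with made-up colours, and every name invented for illustration) that anneals random swaps under a linear cooling schedule, scoring the number of satisfied internal edges:

```python
# Toy annealing sketch for an n x n edge-matching grid (illustrative only,
# not a real Eternity II solver). Pieces are (N, E, S, W) colour tuples and
# the fitness is the number of satisfied internal edges.
import math
import random

def make_instance(n, colours=5, seed=0):
    """Paint every edge of an n x n grid a random colour, then cut it
    into pieces, so a perfect solution is guaranteed to exist."""
    rng = random.Random(seed)
    h = [[rng.randrange(colours) for _ in range(n)] for _ in range(n + 1)]
    v = [[rng.randrange(colours) for _ in range(n + 1)] for _ in range(n)]
    return [(h[r][c], v[r][c + 1], h[r + 1][c], v[r][c])
            for r in range(n) for c in range(n)]

def fitness(grid, n):
    """Count matched internal edges (maximum is 2 * n * (n - 1))."""
    ok = 0
    for r in range(n):
        for c in range(n):
            _, e, s, _ = grid[r * n + c]
            if c + 1 < n:
                ok += e == grid[r * n + c + 1][3]    # east vs neighbour's west
            if r + 1 < n:
                ok += s == grid[(r + 1) * n + c][0]  # south vs neighbour's north
    return ok

def anneal(pieces, n, steps=50_000, t0=2.0, seed=1):
    """Random-swap moves with Metropolis acceptance and linear cooling."""
    rng = random.Random(seed)
    grid = pieces[:]
    rng.shuffle(grid)
    f = fitness(grid, n)
    for step in range(steps):
        t = t0 * (1 - step / steps) + 1e-9
        i, j = rng.randrange(n * n), rng.randrange(n * n)
        grid[i], grid[j] = grid[j], grid[i]
        nf = fitness(grid, n)
        if nf >= f or rng.random() < math.exp((nf - f) / t):
            f = nf                                   # accept the swap
        else:
            grid[i], grid[j] = grid[j], grid[i]      # undo it
    return grid, f

pieces = make_instance(4)
solved, f = anneal(pieces, 4)
print(f"{f} of {2 * 4 * 3} internal edges satisfied")
```

Plotting f against step count gives exactly the convergence curve described above, and rotations, targeted swaps or GA-style crossover would slot in as extra move types.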

The problem could also be broken into sub-problems, which can be solved in parallel. The obvious example is fitting the border: the edge pieces are defined by one of their quadrants being grey, and the corner pieces have 2 grey quadrants. (I wonder how much more (or less??) complex the problem would be if the edges were not defined as such, and all the pieces were of the same format?)… OK, maybe I’ll think about that later…
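As a tiny illustration of that partition (grey encoded as 0 here, purely by convention; the pieces are hypothetical):

```python
# Partition pieces by how many quadrants are grey (grey encoded as 0 here):
# corners have 2 grey quadrants, border pieces 1, interior pieces 0.
# On the 16x16 board that gives 4 corners, 56 border pieces, 196 interior.
GREY = 0

def classify(piece):
    greys = sum(1 for quadrant in piece if quadrant == GREY)
    return {2: "corner", 1: "edge", 0: "interior"}.get(greys, "invalid")

# Made-up pieces, purely to show the classification:
print(classify((GREY, GREY, 3, 5)))  # corner
print(classify((GREY, 2, 4, 7)))     # edge
print(classify((1, 2, 3, 4)))        # interior
```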

I was also then naturally led to ponder whether this problem could be mapped onto a model compatible with a quantum computer, i.e. whether a solver could be written for this specific puzzle that would benefit from a quantum speedup, using some kind of quantum annealing technique. Any takers? 😀

Well there’s only 68 days remaining to solve the puzzle if you want to win the $2m prize!


Genesis Machines

Genesis Machines by Martyn Amos

A nice introduction to several topics. I know next to nothing about genetic engineering – which is odd, seeing as it was a topic I was extremely interested in before I turned to Physics – and I’d like to know more, so it’s nice to read about the way bioscientists actually manipulate strands of DNA and the tools they use in the wet world. Opening with a nice introduction to the Polymerase Chain Reaction (PCR) and the early work of Watson and Crick, the book is written from a historical perspective that gives you a feeling of really ‘following the field’ as it emerged, covering the key players and the early conference proceedings, as well as both the disappointments and the breakthroughs in the key lab experiments. It also had a nice section on SAT and graph theory from a DNA computing perspective. Graph theory deserves more appreciation; it’s just cool.

The book spent a fair bit of time labouring the point, especially describing how the experiments were performed using analogies from popular culture, but I find this happens with most pop-sci books. I guess it depends how good you are at data-mining literature as to whether or not you need to have important concepts repeated.

This book however did make me smile; I could tie in lots of links from the world of Quantum Computing (although the author only mentions this once, and very briefly). Essentially DNA computing is massively parallel in the sense that large numbers of molecules can explore all possible solutions to a problem at once, as the strands ‘stick together randomly in different configurations’ in a test tube, and the strands encoding the correct (or optimal) result are kept. This means you can have a good go at solving small Hamiltonian Path Problems fairly easily. However, it does not mean that it is any more computationally powerful than a classical computer – just that you have more resources available (space resources in this case, equivalent to having a silicon chip with an enormous transistor density). For example, the time taken to solve NP-complete problems still grows exponentially with the problem size.
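The generate-and-filter character of Adleman’s experiment is easy to mimic classically. This sketch (with a made-up four-vertex directed graph) enumerates all orderings, the analogue of the test tube’s random strands, and keeps the ones that form Hamiltonian paths; the candidate pool still grows as n!, which is the exponential cost in disguise:

```python
# Generate-and-filter, the classical analogue of Adleman's DNA experiment:
# enumerate candidate "strands" (vertex orderings) and keep only those whose
# consecutive vertices are joined by an edge. The directed graph is made up.
from itertools import permutations

edges = {(0, 1), (1, 2), (2, 3), (0, 2), (3, 1)}

def is_hamiltonian_path(order):
    return all((a, b) in edges for a, b in zip(order, order[1:]))

# The "test tube" holds all n! orderings at once; filtering is the easy part,
# but the candidate pool itself grows exponentially with n.
solutions = [p for p in permutations(range(4)) if is_hamiltonian_path(p)]
print(solutions)  # [(0, 1, 2, 3), (0, 2, 3, 1)]
```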

I’m not convinced that it’s quite as cool (or scalable, or viable) as Quantum Computing. And is having a robot (or a set of mechanically activated valves) that mixes tubes of solutions together inside your PC any less far-fetched than having a computer with an onboard dilution-fridge cryocooler? I think not. Although it would look awesome if the DNA container cells were all backlit with LEDs 🙂

Whether or not this field will continue to grow, I do not know, but it is nice to read about alternate implementations of computation. I think DNA computing will have important ramifications in medicine and molecular biology. For example, one of the ideas reported in the book was to use DNA computers to control drug production factories within the body itself.

As ‘wetware’ technology and engineering improves, one area that I am very interested in following is bio-electronics research (particularly the interfacing of conventional electronic circuitry with the human nervous system). I can envisage a fruitful collaboration between biotechnology and conventional silicon processing, which I hope I live to see (and perhaps participate in).

… and I wish we’d had LEDs to light the glassware in organic chemistry lessons.

Coffeebreak Links 191008

Science:

Some nice video demonstrations
of science concepts (especially biology based ones)

I love this website.
How many have you heard of?

Pop-Sci book reviews galore, yey!

This year the Royal Institution Christmas Lectures are on the subject of miniaturising electronics, computers of the future, AI, etc. The lectures are aimed at kids, but I usually watch them anyway; they are quite fun 🙂
Check out the event calendar for the lectures which are given by Professor Chris Bishop, with the title “Hi-tech Trek – The Quest for the Ultimate Computer”. I wonder if he’ll mention quantum computers…

Arxiv:

From the arxiv blog: Modelling the spread of religion with a model usually applied to viruses. Interesting… and sure to cause some controversy as usual.

Whilst on the science and religion theme… Frank Tipler on testing the Many Worlds Interpretation. Not sure about this… well, here’s his homepage and here’s some information about Omega Point theory.

Just for fun:

I have no idea why I find this so cool 🙂 A rather surreal and arthouse interpretation of Quantum Computing:

Fridge torture

Here is a picture of what I did yesterday.

Torturing of the dilution refrigerator is the only way to diagnose what is one of the experimentalist’s worst nightmares…

‘The Intermittent Fault’…

I have had intermittent faults with this system for about 3 years now. The main problem arises from the use of tiny stainless steel coaxial cables (0.3mm outer diameter). You can’t solder to stainless steel without using an acid flux, and if you use that, the connectors fall apart due to corrosion after the first few thermal cycles. You can crimp the connectors onto the coax (it’s fiddly), but then thermal cycling can cause mechanical stresses which simply break the inner wire (not difficult, considering it’s thinner than a hair).

Anyway back to the torturing. The faults in the wiring only occur at low temperatures <- That’s the important point by the way. You can’t check continuity of a wire when it’s deep down in a sealed dewar immersed in Liquid Helium.

There are 4 signal lines along which these awkward faults can occur. The way to find them involves dipping the lower part of the dilution refrigerator slowly into an open bucket of Liquid Nitrogen until the fault appears. The temperature gradient above and below the liquid surface is high, so you can accurately pinpoint the problem area in terms of vertical position. Using a heat gun (which looks similar to a hairdryer, but goes up to 650 °C, hehehe) on the part just above the liquid nitrogen surface helps increase the temperature gradient. It creates pretty Nitrogen fog too (a bit wasteful, but LN2 is cheap).

Poor fridge.

The coaxes have now been temporarily replaced with twisted pair wiring as multiple connectors on 3 of the 4 lines were found to be faulty. It’s also a really great way to find dry solder joints. NEVER rely on the fact that you can create 100 or so solder joints that will all remain robust – even if you’ve been soldering since you were about 8 years old 🙂

EUROFLUX2008

I’ve just returned from the EUROFLUX2008 conference in Naples. The conference dealt with two main aspects of superconducting electronics: Fundamental device progress and superconducting devices as detectors. (I was mainly in with the first bunch).

A few conference highlights:

Devices

Valery Ryazanov gave a good presentation on the progress of pi-junction applications for digital and quantum logic. The measurements suggest that good-quality qubits containing SFS (Superconductor–Ferromagnet–Superconductor) pi-junction components have been fabricated. This is something I’m really pretty interested in. For RSFQ applications, pi junctions are useful as they act as phase inverters. In the case of qubits, a pi element can introduce a phase shift across one of the Josephson junctions in the loop. The ground state of the qubit loop will then consist of a spontaneously generated circulating current, a half-integer magnetic flux quantum in the loop, and most importantly a double-well potential at zero applied bias (aka a free lunch). Such qubits are thought to be intrinsically more ‘quiet’, as you do not need to apply a flux bias to bring them to the working point. The presentation gave evidence that pi-qubits may be as good as conventional ‘zero’ types. And because they are Niobium SFS, they are (somewhat) compatible with existing process technology, and the best thing is that they’re low-Tc, avoiding all that HTS messiness.

I think people are still slightly hesitant to believe that magnetic materials and superconductors will ever work in harmony (I spent a year investigating spin-injection devices and came to the same conclusion), not to mention the fact that having magnetic impurities all over the place is not really ideal for a component that’s incredibly sensitive to variations in field. Yet somehow they work. Go physics. Absolutely fantastic progress in this field.

See here, here, here, here and here for a few papers on SFS junctions (there are loads more).

Maria Castellano – High-frequency coherent oscillations of a flux qubit manipulated by fast pulses. The experiment involves Niobium tunable flux qubits. A superconducting loop interrupted by a double Josephson junction loop (a DC SQUID) gives you two handles on the shape of the double-well potential: the barrier height and the asymmetry (bias). The group demonstrated coherent oscillations between 6 and 25 GHz. Operating the qubit at high frequencies means you get more oscillations for your money, and thus more (potential) ops in those vital nanoseconds before decoherence kicks in.
See here and here for some more info.
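As an illustration of those ‘two handles’, here’s a sketch of a simplified rf-SQUID-style potential (dimensionless units, made-up parameter values; the actual device in the talk tunes the barrier in situ via the double junction loop). Here β sets the barrier height and the external flux φ_ext tilts the wells:

```python
# Illustrative rf-SQUID-style double-well potential (dimensionless units,
# made-up parameter values): U(phi) = (phi - phi_ext)^2 / 2 - beta * cos(phi)
# beta controls the barrier height; phi_ext (the applied flux) tilts the wells.
# At phi_ext = pi (half a flux quantum) the two wells are degenerate, the
# symmetry point a pi junction would give you at zero applied bias.
import math

def potential(phi, beta=1.5, phi_ext=math.pi):
    return (phi - phi_ext) ** 2 / 2 - beta * math.cos(phi)

# Sample the potential and estimate the barrier between the two minima.
phis = [i * 2 * math.pi / 400 for i in range(401)]
barrier = potential(math.pi) - min(potential(p) for p in phis)
print(f"barrier height ~ {barrier:.2f} (dimensionless units)")
```

Sweeping phi_ext away from pi tilts one well below the other, which is exactly the asymmetry (bias) handle mentioned above.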

Detectors

IPHT (Jena) are developing THz cameras for security applications.
The cameras scan people passively (e.g. at airports) and can detect hidden items. Metallic items, e.g. guns, knives etc., block the THz emission from the human body, causing them to appear as shadowed regions to the camera. The detector can also pick out non-metallic ‘suspect’ items. These cameras are really coming along, and perhaps fully working systems will become commonplace within the next 5 years or so. You can read more about the progress here.

SQUIDs for magnetic exploration, again from IPHT. These guys hang a cryocooler containing a SQUID from a helicopter and raster-scan the ground to detect magnetic objects. Items of interest could be unexploded landmines, large mineral deposits, or the buried remains of medieval structures.

Conference

The social event was a visit to the excavated city of Herculaneum, buried by the eruption of Mount Vesuvius in 79 AD. It was rather cool to see all the Roman technology of the time.
In true Italian style, the conference dinner was an epic six-course meal. The food in general was fantastic, and the weather in Italy was also great (I think we left just before it began to get really hot). Naples itself seemed to have lots of palm trees:

It was a very useful experience overall, and the conference was small enough that nearly all the talks were relevant to my own areas of interest. The general idea of the conference was a gathering of all the people involved in superconducting electronics in Europe (although the conference was open to an international audience) to further progress in the FLUXONICS project, in the area of Niobium Josephson junction applications, and in the European RSFQ technology.