Feb 23

Is NASA using entangled particle quantum communication?

I’m only halfway through this but it seems to shatter my previous assumptions about what NASA is doing. https://astroengineer.wordpress.com/2010/04/07/a-curiosity-of-spirit-full-document/

AstroEngineer’s Blog

The stories of my slow path to awakening.


I am publishing this story somewhat hastily in response to the hue and cry on the Above Top Secret forums. Apologies in advance for those areas that I would have improved or omitted entirely had I spent more time thinking about what others wanted to hear and less time thinking about what I wanted to say. I attempted, where it occurred to me, to explain the terms and workings of the things I mention, but I may have forgotten to explain some items which could cause confusion and/or Google searches. I will attempt to update this document in the coming weeks as I have time, to improve the clarity and eliminate elements which may be unnecessary or uninteresting to others.

This began and remains a personal journey, an attempt to reconcile and understand my own history and work. If you find it interesting, great. If you don’t, oh well.

The experience I’m about to relate marked the beginning of the end of my NASA work, the end of the beginning of my awakening. The transparent and noble institution I believed I knew was hiding something.

For those who want a synopsis, I’ll try to post one in a few days. Most of you already know parts of it, and I felt it most important that I first get down as many of the specific details as I could to ensure that people wouldn’t accuse me of the greater crime of making extraordinary claims without providing context and specifics.

It is fitting that I publish this today, the 9th anniversary of the launch of Mars Odyssey, the orbiter that made possible so much of the Mars rovers’ good science (in spite of all who have sought to limit it).


P.S. – I hastily inserted chapter headings to try to break up the text logically and make it more readable. These don’t correspond to the original posting in parts.

Chapter 1: Spirit’s Missing Time

Last January I was at JPL on a project when a colleague of mine in the unrelated MER program was working a Spirit glitch. Spirit is one of the two rovers still operating on Mars. Spirit’s expected mission was a little over 90 days, but six years later the rover is still operating. Spirit has performed well beyond expectations, but even the best robot is going to have its off days. One of these off days was in late January 2009. My friend and colleague, who I’ll call Rich, was a senior software engineer on the team tasked with maintaining the mobility flight software, the code that controlled the rover’s movements and experiments.

Spirit had refused an instruction. It had been sent instructions to move, it acknowledged receipt of those instructions, but it did not move. In and of itself that was not highly unusual; the rover is given license to ignore move requests it does not believe it can fulfill successfully or safely. But not only did Spirit not move, its non-volatile flash memory was missing data about its motionless hours. Through a separate subsystem they were able to estimate that it had been awake for at least an hour during the gap, but what it had been doing or why it had decided not to move was unknown. Imaging before and after the event showed no change: cameras, IDD, suspension values, terrain; all were the same. This was the mystery my colleague and his team were engaged in solving. Why did the rover ignore its command to move? Why did the rover record nothing? Presumably the one question would answer the other.

They ran a battery of system checks and all appeared nominal. A refusal to comply can have its origins in the rover having lost track of its orientation, so they attempted to recover this with an on-board program that uses the panoramic camera and accelerometers to locate the sun and determine its own orientation from that. After an initial unrelated failure in this procedure (the accelerometer package was off) they were able to reacquire orientation. After a bit more testing and investigation, with more nominal results, the rover driver (RP) once again prepared move orders that were vetted and ultimately uplinked. This time the rover moved as expected, and recorded its activities to flash.

The most probable explanation for the initial faults (the failure to move, memory loss, loss of orientation) was cosmic rays. Cosmic rays are energetic particles, usually protons, capable of disrupting electronic systems they come in contact with. With all systems once again performing as they should, it would be hard to explain any other way. And this can happen; cosmic rays have been suspected in the transitory failures of other satellites, orbiters, and landers. The potential for problems has been amply demonstrated in labs on earth, and as a result Spirit’s electronics are somewhat hardened, to the degree reasonable for its expected exposure.

As part of working the problem, before Spirit had reacquired orientation, Rich had done a few things to dig deep into the mystery. He had dumped portions of Spirit’s memory to see what he might find through a forensic examination back on Earth.

Spirit has three kinds of memory: 3 MB EEPROM (like a bootable thumb drive, where the VxWorks OS lives), 128 MB RAM (like your computer’s memory), and 256 MB flash memory (like your hard drive). He would have loved to dump everything, but that was completely impossible. Spirit’s downlink is far too slow and there are limits to what you can safely dump from a running system. Downlink usually comes from a combination of UHF to orbiter to Deep Space Network (DSN) and HGA direct to DSN. UHF is faster than HGA but limited by having only two 10-minute passes of the orbiter each sol in which to transmit data for relay. HGA is much slower but the window is considerably longer, three hours in many cases, ultimately limited by power, temperature, and visibility of earth. Total downlink capability is only about 9 MB per earth day. So, Rich focused on building the sequences to dump what he suspected would be key portions of flash and RAM. After thorough testing he got the TAP/SIE to send these on the next uplink.

The commands were in fact sent, which was actually a pretty big coup for Rich. The practice of dumping memory for analysis is a common technique for debugging software on earth, but rarely (possibly never) performed on a vehicle in space. In this case it was allowed because the specific commands were deemed very low risk, and the uplink and downlink bandwidths were otherwise unused as the engineering teams worked the problem in simulator.

Rich pulled down about 7 MB. He was after regions of flash which might hold evidence of a damaged table of contents (TOC), perhaps even hints of the missing records. He was also looking for regions of RAM that might contain loaded low level OS code, possibly revealing unusual EEPROM issues. Also included in what he pulled down were small data samples taken across the entirety of the flash memory. He had a pet theory that a cosmic ray memory failure event may leave some sort of signature in the data on the failed flash memory. He was able to slip this 256 KB flash sampling past the TAP/SIE censors in the guise of sampling for the evidence of missing records, which arguably it could have been. He received the data, but had no opportunity to examine it before Spirit was deemed restored.

Rich’s regular schedule resumed and it was several days before he had the time to dive into the data. A terrestrial memory dump is no special thing; it is a computer’s dejecta, something a software engineer wants to deal with only when he must. But a memory dump from a craft on Mars? Now that’s something special. So on his first day off he began to examine what they’d pulled down. The first few hours were spent writing code to help the analysis, to separate the single block of data received back into its original and separate parts, and to place them in a proper context so his debugging tools could assist in the analysis.

A few hours later he had located the flash’s TOC and found it contained no evidence of records corresponding to the rover’s unresponsive time. This was fine; it was merely a possibility he wanted to check, since the presence or absence of those records would hint at the nature or timing of the initial failure. He looked at the samples he had taken across the flash drive, and found nothing of obvious significance. He knew that was a long shot, and that he would likely need to collect more data from other similar failures, likely from test labs on earth, before he could see if there was a real case to make.

He now turned his attention to the RAM he had dumped. He was targeting the region most likely to contain the code and stack of the applications core to the OS and flight software. Analyzing this would be painful, and possibly fruitless. RAM is very dynamic, and while it’s parceled out in a fairly orderly way at first, as each application requests and releases memory it very rapidly becomes chaotic, and a useful forensic analysis is nearly impossible. At best he was expecting to find chunks of programs with signatures he could match up with code from the EEPROM he knew to be on-board, as well as the data (the stack) from some of those running programs. What he would be looking at would be machine language, not the original high level language, C. If he found anything interesting he could use decompiling tools to give him a leg up in understanding the machine language he was seeing by turning it back into the more readable C. His days off were over before he’d found anything of interest, and it wasn’t until his next time off that he was able to give things another look.

Over the next week Rich had an idea for a tool to help him understand the RAM dump. He wrote a little utility to sample a dump of the version of the EEPROM most recently uploaded to Spirit, generating signatures (context triggered piecewise hashes) for the various pieces of application code on it. The signatures themselves were made of short regions of the applications, so in this way he might be able to better match the fragments he would have in the RAM dump. He finished writing the tool and generating the signatures during the first half of that next day off.

He then ran it against the RAM he dumped and was able to identify quite a few sections of application code. He began to check (diff) each section of recognized application code against the EEPROM source. All were as expected except for regions of code related to telecom. Only portions of the RAM dumped version matched the EEPROM version. There was nothing yet startling here; perhaps he was comparing RAM to an older EEPROM dump, perhaps this section of the rover’s memory had been unloaded and the region reused while he was dumping it, or perhaps there was a bug in the code he wrote to break up the original monolithic dump file.

He double checked everything. He was using the latest known rover code. He didn’t see anything wrong with his parser. The only thing he could do was to see whether or not the regions which differed looked like they were actually part of the original telecom code. He found some later regions which were, and this strongly suggested that either portions of the original code had been damaged or that the portions were simply different by design, representing a different (older or newer) version of the code. He used his decompilation tools to examine the entire section of application code he could reconstruct and marked the regions which were different. He then began to walk those sections of code. Sure enough the code was different, and very oddly so.
The code dumped from the rover included function calls referring to a device accessed via the VME interface (similar to a computer’s PCI slot) which Rich did not know about.
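I can't share Rich's actual tool, but the idea behind context triggered piecewise hashing is simple enough to sketch. The following is my own illustration in Python, not the MER code: chunk boundaries are chosen by a rolling hash over the content (so an edit only disturbs the chunks near it, not everything after), each chunk gets its own hash, and two dumps can then be compared by how many chunk hashes they share. The window size, boundary mask, and function names are arbitrary choices for the sketch.

```python
import hashlib

WINDOW = 7            # rolling-hash window size (arbitrary for this sketch)
BOUNDARY_MASK = 0x3F  # cut a chunk when the low 6 bits are all ones

def _chunks(data: bytes):
    """Split data at content-defined boundaries via a rolling byte sum,
    so a localized change only disturbs the chunks around it."""
    rolling, window, start = 0, [], 0
    for i, b in enumerate(data):
        window.append(b)
        rolling += b
        if len(window) > WINDOW:
            rolling -= window.pop(0)
        if (rolling & BOUNDARY_MASK) == BOUNDARY_MASK:
            yield data[start:i + 1]
            start = i + 1
    if start < len(data):
        yield data[start:]

def signature(data: bytes) -> set:
    """Piecewise signature: one hash per content-defined chunk."""
    return {hashlib.md5(c).hexdigest() for c in _chunks(data)}

def similarity(sig_a: set, sig_b: set) -> float:
    """Fraction of chunk hashes the two signatures share (Jaccard index)."""
    if not sig_a and not sig_b:
        return 1.0
    return len(sig_a & sig_b) / len(sig_a | sig_b)
```

Run something like this against a RAM fragment and a known EEPROM image and a high similarity score flags the fragment as probably the same application code, even when parts of it differ.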

Chapter 2: The Mystery Device

You work on a project like this and there is little about it which remains unfamiliar, so the apparent discovery of a device you didn’t know was on board gets your attention. Rich’s initial assumption was that this was entirely benign; he was sure it was a piece of vestigial code meant to connect to a device which the science or engineering teams had decided, post software spec, couldn’t go to Mars. And despite best efforts, old code can sometimes reappear in deployed software. As separate developers merge their own changes back into a source tree, things can creep back in. If the leftover and unused code caused no problems in testing, then likely no one would notice and the code would be deployed.

Rich checked the source code repository for any sign of this code in that application, in any earlier versions. Nothing. He then looked at the application’s stack from the RAM dump and found evidence that several of the mysterious functions had actually been called. This code was not vestigial; there was assuredly some device on the VME bus with which the code was interacting.

As he examined the decompiled code from Spirit, the decompiled official terrestrial EEPROM code, and the official checked-in source which compiled to the EEPROM, he began to piece together how this code appeared to be used. It looked as if this code was executed when the telecom package was instructed to transmit data back to Earth. The raw, experimental data files on flash memory would be passed to this mystery code, where they would then be sent to this mystery device. The image data from flash appeared to first be re-processed by the ICER (MER’s image compression) library to drop the resolution before going to the mystery device; the images passed to the device were akin to thumbnails of the originals. This is where I got involved.

Rich and I had worked together quite a few times, and we’d even talked about opening up our own consulting business back in the dotcom days, when a private space race first seemed a possibility. He wanted me to confirm that what he was seeing was real, and he knew if I was able to show him it wasn’t I’d do so without the harsh judgment that can make reaching out to others tricky. Years earlier he’d confided to me a UFO sighting he experienced while camping in Joshua Tree National Park. He didn’t know what to make of the experience, and neither did I. I don’t personally believe or have any knowledge that suggests UFOs are alien in origin, but neither do I have reason to disbelieve the very genuinely related and seemingly conventionally inexplicable experiences of a handful of friends and relatives. Maybe they were hallucinating, maybe they were mistaking natural phenomena, maybe they were mistaking civilian or military craft, maybe they were making up the story for attention, or maybe there were aliens operating the UFOs. I had no ability to know which was the case, and in the absence of sufficient evidence, I never felt the need to pick a pony in that race. So when he called me in, it made emotional sense to me. He was vague on the phone, just asked me to come over, told me he was looking through the rover dump and wanted a second opinion. His tone betrayed some of his excitement.

I got to his house on that Sunday around 7 pm. He related to me roughly the story above. I spent the next four hours having him walk me through each step, from the dump parsing through the EEPROM app signature generation and matching, to the discovery of the variant telecom code, to the isolation of the suspect functions, to the evidence of those functions being called, and on to what those functions appeared to do when called. I could find no break in the chain of his reasoning, every link was sound.

We were both pretty excited at this point, a mixture of the esoteric delight in knowing that we knew something few others did about a vehicle on another planet and the nervous knowledge that we likely weren’t meant to know this, and knowledge has had a cost all the way back to the apple; we heard the whisper of the snake, and pressed on.

He wanted to gather as much information as we could before he headed back in the next day. He was as yet undecided on whether he’d bring this up at the first SOWG meeting the next day, or whether this was better left unremarked. He wanted to make as much progress as possible that night, in the hopes that his decision the next day would be clear.

We were sure the mystery device was a radio, the code required it to be. But the fundamental question on the table was, why was this mystery radio there, and what kind of radio was it? We couldn’t interrogate it directly and we’d exhausted what the code we had available could tell us, we were left with trying to logically argue what outputs made sense given its inputs. We considered many possibilities and hit upon two that seemed within the realm of possibility.

Mars Reconnaissance Orbiter (MRO) arrived on the Mars scene in the spring of 2006, two and some years after Spirit touched down. The MRO had a new UHF radio package called Electra, more capable than what Mars Odyssey offered. Perhaps the rovers had been given a VME card which would allow them to take advantage of the increased bandwidth Electra would offer? This seemed like such a promising possibility: it could explain why the code had hooks where it did and why the system wasn’t well known. Such a device in the rovers would likely never be used, as the rovers weren’t expected to live long enough to see the MRO. Even if the rovers were still around, changes to Electra between the MER launches and MRO orbital insertion might mean this device was outdated and unusable. Electra was engineered to benefit Mars missions; legacy support might be highly desirable but not likely mission critical.

But the more we looked at this, the most likely candidate, the less sense it made. Electra had in fact been used with MER, but so far as we knew only with the standard telecom package, and only for tests. If Electra was now being used via some new device, why hadn’t we been alerted? Why was the code we looked at hard coded to send lower resolution thumbnails of the rover images? Downlink via relay requires the DSN, and time on it isn’t lightly given. Any test sending redundant data like experimental data and image thumbnails would be short-lived, yet this code was hard-coded; it would have been active from the time of the last mobility flight software update to the time of the next one. Updates are rare, and everyone knows when they occur; they cost up to two sols to implement, and generate considerable anxiety. No such updates were recent enough for this to be merely a test. Spirit would need to be broadcasting twice, once via the normal channel, and once via this alternate channel.
No regular downlinks via the normal channels were being missed; deviation from that norm would have been noticed and talked about. With ongoing concerns about power levels and overheating, the unnecessary and prolonged power drain which would result from the duplicate broadcasts would be noticed. We couldn’t entirely dismiss this possibility, since it fit at least a few of the facts and all of our own prejudices. But we began to look for further evidence to support or refute it. And we considered the other likely explanation.

If the radio wasn’t using the UHF antenna, then it was most likely using the HGA to reach earth directly. This idea had a few powerful arguments in favor of it. The bandwidth possible with the high gain antenna was much less than that via the UHF antenna, and we knew the data being sent was also much less. Also, the dual transmissions could be explained as a safeguard, a redundancy. Packet loss from one transmission (via HGA or UHF) could be filled in with the reduced resolution portions from the second HGA transmission. But then we realized something that perhaps was most compelling from the human point of view. Humans are impatient beings, and sending the second transmission with thumbnails either before or somehow coincident with the primary HGA signal would allow more time for decisions to be made in planning nextersol. But if that was the case, who was receiving those images and experiment data? My colleague was sure no one he worked with was benefiting from it.

We tried to prove the case of a second HGA broadcast. The first thing we did was try to compute the bandwidth this radio would be using, given what we knew about the data it would be sending. Crude calculations showed it would need to handle about 25 kbps. This was an important number because it was 50% faster than what the rovers could do with their HGA, and less than half of what they could do through the UHF to Odyssey. It seemed unlikely they were primarily using a slower radio for the HGA, and only secondarily using this faster one.

Next we looked at power levels. HGA transmission draws quite a bit of current, and power levels for the rovers are closely monitored; a beam of sunlight can’t hit a panel nor a wheel disturb a single grain of Martian soil without it being recorded as fluctuations in the rover’s power. Admittedly that’s hyperbole, but surely we might be able to estimate the power this mystery radio was drawing by looking at power logs across the rover’s life. My colleague had access to enough material that night to show that whatever current the additional transmission was requiring was minimal, within the margin of error, far below the levels of either UHF or HGA (or even LGA). It seemed to be a frequently used, medium bandwidth, ultra low power radio. That seemed impossible; the power effectively determines the transmission distance, and at this power level, the radio would have effectively not been broadcasting.

We were about ready to go back to square one and assume this was not broadcasting anything at all. Perhaps it was performing some other minor function; perhaps it was obscure image-processing hardware for generating parity blocks used to add robustness to the images generated by the ICER library, triggered for some peculiar reason via the telecom code just as transmission is occurring. But we didn’t abandon our radio hypothesis just yet; we couldn’t abandon our gut feeling that this was really a radio and that the most practical reason for it being there was to get a limited but useful set of sol data to ground before or in lieu of the full set.

It was roughly 4 A.M. when we gave up for the night. My colleague decided not to bring any of this up at SOWG. He and I would regroup the next day after work and with clearer heads see what we thought, and what we thought we could do about it.

Chapter 3: Looking for a Receiver

Between breaks at work he used his access to follow up on our line of reasoning. If it was a radio and if it was transmitting, DSN had to be involved. He saw nothing to suggest that was the case. If it was a radio and it was transmitting, one would expect something to be receiving. How could he find who was receiving? Finding the receiving radio or signal directly would be almost impossible, but he realized he might be able to find evidence of what was received.

He knew what the code had been doing to Spirit’s images, so on earth he recreated the function calls to ICER and applied the same transformations to about 100 Spirit images randomly picked from across recent months. If he could find any of these recreated thumbnails on any computers he had access to, he had almost assuredly proven there had been a receiver, and likely would be able to determine who it was. The image transformations used to make the thumbnails were so specific that they could be considered unique. By mid-afternoon he’d generated the thumbnails and their corresponding md5 hashes. He now wrote a small application to scan specified directories and their sub-directories for those thumbnails, based on their unique signatures, their “hashes”. He was able to integrate this code with something he’d written a year before to help with forensic analysis of common Windows and UNIX file systems. He would be able to search for matches amongst even some deleted files; viability of reading deleted files depends on whether the space they used has been needed and reused by new files.

Just an hour or so before the end of his day he got a version of his app compiled for the different target computers he would check. He had been given unofficial sudo (admin) access to a few key systems (from when he was troubleshooting problems for people), as well as the other systems he had normal user access to; these included some of the NAIF and FLTOPS servers. The jobs would take hours to complete. He left for the day, the jobs still running. We ended up not meeting that night. We were both wiped from the lack of sleep the night before, and his avenue of investigation was better than anything I had in mind.
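The live-file half of a scanner like his fits in a few dozen lines; the deleted-file recovery is the hard, filesystem-specific part, and I've omitted it here. This sketch is my own reconstruction of the approach, with illustrative names, not his actual tool:

```python
import hashlib
import os

def md5_of(path: str) -> str:
    """MD5 of a file's contents, read in chunks to keep memory flat."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(65536), b""):
            h.update(block)
    return h.hexdigest()

def scan(root: str, wanted: set) -> list:
    """Walk root recursively; return paths whose MD5 is in `wanted`."""
    hits = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                if md5_of(path) in wanted:
                    hits.append(path)
            except OSError:
                continue  # unreadable file: skip it, as a real scanner must
    return hits
```

Because the thumbnail transformations were so specific, an exact-hash match like this is enough; there's no need for fuzzy matching once you can regenerate the target files bit for bit.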

The next morning I got a call just after 7:00 A.M. One of his jobs, run on one of the servers used to receive downlink imagery before distribution to the teams, had gotten six hits. The hits he got were all deleted files, in a temp directory under the home directory of a user he wasn’t familiar with. His sudo access on that computer was what allowed him to find the deleted files, and what allowed him to research the account. Items in the home directory and the user’s shell history pretty conclusively showed the account was owned by someone in JPL’s Quantum Sciences and Technology Group. We now had a bizarrely intriguing who, but the why remained wholly untouched. I could barely focus on my own tasks all day. We would meet up that evening; he was going to quietly recover what he could from that directory and we’d go over his findings. I left a little early, and so did he.

We met at his place, straight out of work. He sat at his computer, and I sat next to him with my laptop. He copied the data he’d archived from his thumb drive. Then he gave it to me and I did the same. He had copied much of what was under that user’s account as well as another account which was part of the same user group. He had recovered quite a few deleted files (though many were usable only as file metadata) under the original account. Most of the recovered files were in the same directory, apparently some sort of temporary staging directory. Those files were consistent with the thumbnail imagery: many of them matched the signatures he’d generated, and others we could visually confirm were the same exact transformations of rover images, though these were ones he had not transformed for comparison.

I felt somewhat uncomfortable with what we were doing. Up until the point where he had run his jobs and used unauthorized access to scour file-systems and go into other users’ directories, we had been overly curious, but no one could fault us for anything we’d actually done. I couldn’t fault him for what he’d done; I likely would have given in to the same temptation. I was clearly a willing accessory before, during, and after the fact. Still, breaking the rules is not something I normally do, and I was deeply uncomfortable.


Our primary goal for the evening was to find out what clues the files in the account might give up as to the mystery radio. We decided to split up the workload. I would look at the meta data associated with the matching recovered imagery to see what I could learn from it and he would look through the other files in the two user directories he copied to see what else might be worth looking at.

My task started simply enough: trying to figure out when these images were actually received. If we could identify the moment they were received, that might tell us a great deal about the method used to send them. I began by building a list of the modification times of the deleted thumbnail images we’d matched, then expanding to all the images we’d recovered, and matching those backwards against other images from Spirit. Aside from the modification timestamps, files on most file-systems also have access and creation timestamps, but they are rarely useful; they are either not preserved during most copy operations or their use makes them misleading. Timestamps are themselves all stored in terms of UTC (which is Greenwich Mean Time unadjusted for daylight saving time). In theory, if Server A received the image from a downlink at 12:34 UTC and these images were transferred to Server B and then on to Server C, the modification timestamp should still read 12:34 UTC, the time the file was written to disk. But what holds in theory doesn’t always hold in practice; it would all depend on how the images were being transferred between the servers and how inaccurate any or all of the servers’ clocks were. I knew not to expect too much. Anything I would find would likely only be suggestive, and not conclusive. Nonetheless, what I was looking to tease out of the data was an indication of whether the mystery radio downlink occurred before, after, or during the normal radio transmissions, and perhaps an estimate of the bandwidth at which the data was received (as indicated by the deltas of the write times and the known file sizes).
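Pulling those UTC timestamps is straightforward: a file's modification time is stored as seconds since the Unix epoch, which is timezone-free, so interpreting it as UTC needs no local-time or daylight-saving adjustment. A minimal sketch of the idea (my own illustration, not our actual scripts):

```python
import os
from datetime import datetime, timezone

def mtimes_utc(paths):
    """Modification times as UTC datetimes, sorted oldest first.

    st_mtime is seconds since the Unix epoch; converting it with an
    explicit UTC timezone sidesteps the local clock and DST entirely.
    """
    stamps = [(datetime.fromtimestamp(os.stat(p).st_mtime, tz=timezone.utc), p)
              for p in paths]
    return sorted(stamps)
```

Sorting the (timestamp, path) pairs gives the apparent order in which the files were written to disk, which is the raw material for everything that follows.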

It was a matter of record when downlinks occurred. We knew exactly when the DSN was receiving its transmissions from the rover, directly or via the orbiter; log files unequivocally testified to that. We knew from data contained in the traditional rover transmission what time it was in UTC when it began transmitting from Mars. It was just a matter of comparing all these times and trying to explain what they showed.

The first thing I found was that the timestamps did correspond to the expected reception times via the DSN. And the thumbnail hits we had were received within 25 minutes of when we knew the rover began broadcasting. All of this was perfectly reasonable; Mars was far enough away that a radio signal would take almost 20 minutes to reach earth. And I still hadn’t ruled out server clocks being slightly off, or timestamp changes as the files made their way from reception to where we found them. I couldn’t directly rule out server clocks being off; I didn’t know which servers might have been involved, and I likely wouldn’t have had access to them to check their clocks.

I did notice something interesting about the deltas between the mystery thumbnails. Interpolating from the sequence numbers of the images, the differences in their timestamps, and the file sizes, it looked like the data was being received at about 30 kbps. That number was significant because it roughly fit our expectation of 25 kbps, which was faster than the rover’s HGA direct-to-earth rate, and much slower than its orbiter-to-earth rate. What was most significant was that it lent credence to the idea that the timestamps were the originals, and had not been changed since initially written to disk by whatever was receiving them. I say this because no link in the networks connecting the DSN to the ultimate resting place of these images was 30 kbps, and no image decoding or transformation software that would have operated on the images and possibly changed the timestamps would have operated on them at so slow a rate. The only plausible explanation was that the timestamps were the original timestamps.
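The 30 kbps figure came from exactly this kind of crude interpolation: treat the gap between consecutive files' write times as the time the later file spent arriving, and divide bits by seconds. A sketch of the arithmetic (the numbers and names are illustrative only):

```python
def estimated_kbps(files):
    """Estimate link throughput from (mtime_seconds, size_bytes) pairs.

    Assumes each file was written to disk as it finished arriving, so
    the spans between consecutive mtimes account for the later files'
    time on the wire. This is rough interpolation, nothing more.
    """
    files = sorted(files)  # oldest first
    if len(files) < 2:
        raise ValueError("need at least two files")
    elapsed = files[-1][0] - files[0][0]
    if elapsed <= 0:
        raise ValueError("need at least two distinct timestamps")
    # Bits of everything after the first file, over the total span.
    total_bits = sum(size * 8 for _t, size in files[1:])
    return total_bits / elapsed / 1000.0  # kilobits per second
```

Four files of 30,000 bytes landing 8 seconds apart, for instance, works out to 30 kbps, the sort of figure that first suggested the timestamps had never been rewritten.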

Things got more interesting when I looked at the bandwidth interpolation I’d made and carried it backwards to estimate when the first image of the mystery transmission would have been received, after adjusting for rover status and science data. The first mystery thumbnail hit was received roughly 25 minutes after the rover would likely have begun to broadcast it. But what I hadn’t initially realized was that it would have taken some time to send out the other data and images ahead of it in the queue. As I drew the line on my graph out, it crossed at roughly the t=0 mark, meaning that the first bit of data transmitted would have been received very soon after the rover could have begun transmitting, certainly far sooner than the distance from Mars would have allowed. Clearly some piece of data I had was wrong: the time might be wrong, my bandwidth estimate might be wrong, the transmission and reception times I had found might have been wrong, or our reading of the rover’s mystery radio code had been wrong about the simultaneity of transmission. I’d hit a road block; I didn’t have the information or access I needed to do more.
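The back-extrapolation itself is simple arithmetic. Given the estimated link rate and the data queued ahead of the first matched thumbnail, you can work backwards to when its first bit must have been received, then compare that against the one-way light time to Mars. All the numbers in this sketch are illustrative, not our actual figures:

```python
def implied_start_offset(first_hit_delay_s, queued_bytes, kbps):
    """Seconds after the rover began transmitting that the first byte of
    the transmission would have been received on the ground.

    first_hit_delay_s: receive time of the first matched thumbnail,
        measured from when the rover started broadcasting.
    queued_bytes: data ahead of that thumbnail in the transmit queue
        (status, science data, earlier images).
    kbps: the estimated link rate in kilobits per second.

    If the result is well under the one-way light time to Mars (on the
    order of 20 minutes), something in the chain must be wrong: a clock,
    the rate estimate, or the assumption of light-speed transmission.
    """
    queue_seconds = queued_bytes * 8 / (kbps * 1000.0)
    return first_hit_delay_s - queue_seconds
```

With a 25-minute delay, roughly 5.6 MB queued ahead, and a 30 kbps link, the line crosses at about t=0, the impossible result described above.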

While I had been working this timestamp data and looking up and reviewing transmission and reception times, Rich had decompiled and analyzed some executables he’d found in the user directory that held the deleted mystery thumbnails. I told him my situation: I was stuck, and I must have made a mistake. He stopped, paused a second or two, turned to me, and asked, “What do you know about entangled particles?” “No!” I said, my eyes wide.

Chapter 4: Faster Than Light

Quantum mechanics is profoundly weird stuff. Anyone who claims to understand it completely is a liar, and I’m no liar. My level of knowledge comes from a few upper level college courses, what I’ve read or kept up with in popular books and magazines, and the pseudo-information presented in sci-fi shows. I know just enough to impress people at cocktail parties while saying little so wrong as to incense an actual expert who might be listening.

Entangled particles are usually spoken about in the context of what’s called quantum teleportation, also known as entanglement-assisted teleportation. It’s beyond my abilities and free time to try to badly explain quantum mechanics, so I urge anyone who doesn’t know anything about it to read a one page “How Stuff Works” or Wikipedia page on the topic before continuing.

Two particles can have their quantum states entangled such that a change to the quantum state of one creates an instantaneous change in the other. But if conventionally accepted quantum mechanics is to be believed, the word instantaneous in the previous sentence is grossly misleading. While the quantum state is instantaneously transmitted, most in modern physics believe that you cannot actually read the information being passed via the quantum states without decoding it using information which can only be conveyed via another non-instantaneous (light speed limited) channel. It’s a bizarre paradox; the information is instantly there at some remote location, but not instantly interpretable. Conventionally accepted quantum physics would seem to put the kibosh on using quantum teleportation to transmit data faster than light. But what if the conventional thinking was wrong?

This is what I knew of entangled particles when Rich asked me the question. I knew just enough to recognize the significance of his question, with equal parts excitement and revulsion. Believing in faster than light communication via entangled particles was held by most scientists to be roughly as close to lunacy as believing in Bigfoot; it was to invite the scrutiny and derision of your peers. I wasn’t comfortable with that. On the other hand, if entangled particles were at play, the world had suddenly become vastly more interesting, even if my cowardice or discretion meant I would choose never to let anyone else in on what I thought I knew.

“I know,” he said, seeing in my face the desire that this possibility should instantly be disproved. “But look at this. I’m looking through some of the binaries in this directory, and look at what’s in their strings table.”

When source code is compiled to make executable binaries, the original high level programming language (frequently in aerospace this language is “C”) is converted into machine language. During this conversion all the original, readable source code goes away. But any literal text within that source code that isn’t itself code is left undisturbed, aggregated together in something often called a strings table. These individual pieces of text are “strings” in computer parlance, and they represent things like text for dialog boxes, server names, website addresses, external function or library names, file names, etc.
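The UNIX `strings` utility does essentially this scan; a minimal Python equivalent, just to show how little magic is involved:

```python
import string

# Printable ASCII bytes, minus the whitespace control characters.
PRINTABLE = set(string.printable.encode()) - set(b'\t\n\r\x0b\x0c')

def extract_strings(data: bytes, min_len: int = 4):
    """Return runs of printable ASCII at least min_len long, like strings(1)."""
    runs, current = [], bytearray()
    for byte in data:
        if byte in PRINTABLE:
            current.append(byte)
        else:
            if len(current) >= min_len:
                runs.append(current.decode('ascii'))
            current = bytearray()
    if len(current) >= min_len:
        runs.append(current.decode('ascii'))
    return runs

# A fake "binary" with embedded text, the kind of thing a strings table holds.
# The identifiers are invented, not the ones we actually saw.
blob = b'\x7fELF\x00\x01\x02qubit_encode\x00\x03\xffentangle_pair\x00'
print(extract_strings(blob))  # ['qubit_encode', 'entangle_pair']
```

Run against a real executable, this spills out exactly the kind of suggestive vocabulary Rich was pointing at.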

I peered over his shoulder, and between all the normal context-less random drivel of strings meant for dialogs or common library references were suggestive words like “quantum”, “entangle”, “superpos”, “qubit”, “superdense”. These were all words that clearly fit within the quantum teleportation lexicon. For example, a qubit is a quantum bit, that is, a 0 or 1 stored via a particle’s quantum state rather than on magnetic media or in traditional memory.

I asked, “Do you know of any legitimate quantum teleportation research suggesting faster than light is possible? Sure, the state changes instantaneously, but from all I’ve read you can’t convert the qubit back into the bit without classical data delivered at light speed.”

He responded, “That’s been my understanding. But, doesn’t mean I haven’t missed something. It’s been years since I studied it. I know there’s been a lot of recent experiments. They’ve been able to separate entangled particles by miles now. I even read about plans for an experiment to emit an entangled particle via a laser aimed at a satellite, and then they’d read the quantum state on the satellite after altering its twin on Earth. Maybe they found a new way to read the qubits.”

We were tired, and I knew I didn’t feel up to the task of trying to mount an attack on any generally accepted laws of quantum mechanics. Instead we spent the next few hours doing the only thing we could: we created some monitoring code based in part on his previous thumbnail hash checking code. This new code could monitor the account where he’d found the deleted mystery thumbnails and the mysterious executables, and it would do three things: watch for jobs run by that user, watch for network connections made by jobs run by that user, and watch for files created within that user’s home directory (or any of its sub-directories), including the folder where the previous images had been recovered. All activity would be logged with timestamps, and where possible secret duplicates of files would be made. And when something happened, it would notify Rich. He uploaded and started the process, and I headed home.
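I won’t reproduce Rich’s actual code, but the file-watching piece of such a monitor can be sketched by polling directory snapshots (the process and network watching are omitted here):

```python
import os

def snapshot(root):
    """Map each file under root to its (mtime, size)."""
    state = {}
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                st = os.stat(path)
                state[path] = (st.st_mtime, st.st_size)
            except OSError:
                pass  # file vanished between listing and stat
    return state

def diff(old, new):
    """Report created, deleted, and modified files between two snapshots."""
    created = [p for p in new if p not in old]
    deleted = [p for p in old if p not in new]
    changed = [p for p in new if p in old and new[p] != old[p]]
    return created, deleted, changed

# The real monitor ran a loop like this, logging each diff with a
# timestamp, quietly duplicating new files, and notifying Rich:
# before = snapshot('/home/suspect_user')   # hypothetical path
# while True:
#     time.sleep(30)
#     after = snapshot('/home/suspect_user')
#     report(diff(before, after))
#     before = after
```

Polling is crude but has the virtue of needing no special privileges or kernel hooks.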

Chapter 5: This Wasn’t Schrödinger’s Cat

It was just after three in the morning on February 11th, 2009. In just under six hours Earth would rise above the Martian horizon and the HGA would be in LOS with the DSN. If the telecom code was going to trigger the mystery radio’s simultaneous transmission, that would be the next opportunity.

The exhaustion of the last few days caught up with me and I slept soundly through the rest of the night. Unrelated to all this mystery, I had actually arranged the prior week to take the day off. The car needed servicing, the cat had a vet appointment, and my son had been waiting for weeks for me to take a look at his broken gaming PC. I woke up at 8:30 am and made it to the vet’s for the 9:00 am appointment. I was anxious. Earthrise would occur at 9:03 am, and within five minutes of that Spirit would be high enough in the Martian sky that it would begin transmission via the HGA and possibly the mystery radio. I was sitting in the examination room with my cat waiting for the vet when the clock rolled over to 9:07 am.

My phone rang. It was Rich. The monitoring code had detected activity and notified him. He began, “The images are arriving! I know the server they’re coming from, and it’s definitely one of the Quantum Sciences computers. That’s all I know so far. We’ve got a copy of the process which wrote the files, and they’re still coming in, probably will be for a while. Nothing’s been deleted yet.” We talked for only a minute or two. Rich agreed to encrypt the log and send it to me so I could look at it while I was stuck at the dealership. If something else came up he’d let me know. The tedium of the vet visit was made far more unbearable than usual by the great desire to be exploring this mystery. Once the cat got his prescription we headed on to the dealership. I’d be stuck there for the next two hours, sitting in their lounge, the cat beside me in his carrier. I’d brought my Asus Eee PC netbook, so I pulled it out and got online via my tethered phone. Rich’s email was waiting for me. The subject, “Very interesting dump file”, with a body that said only, “See what you think.”

Rich is a security freak. Most of what I’ve learned has been a result of his paranoia, curiosity, coding, and resourcefulness. I mean to one day share the approach he taught me for anonymously accessing the web. I’ve seen no more exhaustive approach described online. He takes no shortcuts with security. Maybe he could afford to; maybe with shortcuts you’d be almost as good, but you couldn’t be better.

What he sent me really was a dump file, just as the subject of the message said, but it was a dump file with encrypted data hidden inside it. He had made a little utility for inserting payloads into and removing payloads from otherwise legitimate dump files. It would literally hide the payload so that it was a valid region of memory inside the dump file. Dump and debugging utilities would see the dump files as entirely legitimate, and they could be opened and investigated. The encrypted regions would not be meaningful to anyone without the decryption key, but neither would they raise any flags; the encryption made them appear like random uninitialized regions of memory. Rich is big into steganography, hiding data inside other data. There’s no better way to protect your secrets than by preventing people from knowing you have secrets.
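The principle is easy to illustrate: well-encrypted bytes are statistically indistinguishable from uninitialized memory, so a payload can be overwritten into a region of a dump without changing its size or structure. The sketch below is a toy, not Rich’s utility, and its XOR keystream is for illustration only, not real cryptography:

```python
import hashlib

def keystream(key: bytes, n: int) -> bytes:
    """Toy keystream built from counter-mode SHA-256. Illustration only;
    do NOT treat this as real encryption."""
    out, counter = bytearray(), 0
    while len(out) < n:
        out.extend(hashlib.sha256(key + counter.to_bytes(8, 'big')).digest())
        counter += 1
    return bytes(out[:n])

def xor(data: bytes, ks: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(data, ks))

def embed(dump: bytes, payload: bytes, offset: int, key: bytes) -> bytes:
    """Overwrite a region of the dump with ciphertext; to debugging tools
    it reads as an ordinary, uninitialized-looking stretch of memory."""
    ct = xor(payload, keystream(key, len(payload)))
    return dump[:offset] + ct + dump[offset + len(payload):]

def extract(dump: bytes, offset: int, length: int, key: bytes) -> bytes:
    return xor(dump[offset:offset + length], keystream(key, length))

dump = bytes(4096)                # stand-in for a legitimate core dump
secret = b'log entries go here'
stego = embed(dump, secret, 1024, b'shared-key')
assert len(stego) == len(dump)    # the dump's size and layout are preserved
```

A real implementation would use a proper cipher and would place the payload where the dump format expects a data region, but the hiding trick is the same.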

I extracted the encrypted log file from the dump file, decrypted it, and sure enough the log appeared to confirm that the server was seeing these files added just a few minutes after Earthrise on Mars. This was at almost the exact instant the HGA was scheduled to begin its transmission. The delta between our script logging an image being added and the modification time of that image, apparently preserved from wherever it originated after being received from the DSN, was under a minute. Other entries in the log file indicated the processes and servers involved, but there was little I could do about those from the dealership’s lounge. The servers implicated were all recorded as private IPs, and from this netbook I had no access that would allow me to reverse them remotely. But Rich had said at least one of them was operated by Quantum Sciences.

I spent the next hour doing the only thing I practically could: I read everything I could find about current opinions and experiments related to quantum teleportation and faster than light (also known as superluminal) communication. It made for a very frustrating beginning to what would become a very frustrating afternoon. I have a strong aversion to ambiguity, and I would have my fill.

I was able to confirm that most quantum physicists still believed the quantum state was instantaneously transferred but that the point was made moot by the inability to determine the transfer had occurred except via classical data communicated at light speed. But as hard as I looked I could find no experiments which seemed to directly test this assumption, let alone prove it. With my limited knowledge I couldn’t even begin to suggest how one might construct such an experiment.

I did find quite a few arguments by scientists both professional and amateur that supported the possibility of superluminal data transfer not only by quantum entanglement but also by gravity waves, worm holes, higher dimensions, and a few other means. Quite a vocal minority of scientists argued that the transactional interpretation of quantum mechanics, a fascinating attempt to reframe the understanding of quantum interactions as the interaction between forward-in-time and backward-in-time waves, suggested a means by which superluminal communication might be allowed. It should be noted that the developer of the transactional interpretation did not himself believe this approach made any altered statements about the viability of superluminal communication.

One of the strongest arguments in favor of the possibility of faster than light communication I found was that while most scientists hold to the “No-Communication Theorem”, which describes the impossibility of faster than light information transmission, some scientists argue that certain experiments seem to show that the No-Communication Theorem is not a limiting factor in the speed of communicating information. There may still be a limitation, but the No-Communication Theorem isn’t the explanation. It was the first seemingly widely accepted glimmer of hope I’d seen; it wasn’t proof that it could be done, but it was one less rule which might forbid it.

The research continued throughout the rest of the day from the comforts of home, after an hour spent diagnosing my son’s computer and replacing his sound card with a spare. I found nothing conclusive on the superluminal front, just vague and hopeful belief on one side and dogmatic and disdainful disbelief on the other.

Chapter 6: Sniffing the Qubits

I got to Rich’s place about 8 pm. He had nothing new to report, having spent the entire day playing catch up with his actual work. He brought home his thumb drive with the data he’d gathered. He had the rest of the log he’d sent me, the actual binaries of the processes the log had flagged, the server names he’d gotten by reversing the logged server IPs, traceroutes to those servers, and the images/files created under the user’s directory. He copied the contents to his computer and then I to mine. Our goal tonight was to try to understand where these images were coming from, and what they might be being used for.

The process which wrote the received images to disk had only one network connection, and that was, as Rich said, to a computer within JPL’s Quantum Sciences and Technology Group. It wasn’t possible for us to tell to whom exactly the computer belonged or where it was physically located. We didn’t have access to the routers required to trace the computer to an individual office, and we didn’t have access to the netops database that would tie the computer’s assigned name to an individual owner. That was as far as we could go on that issue for now.

Rich had decompiled and was examining the process which had written the mystery files. The process itself looked simple enough: it acted as a server waiting on a port for a client to connect to it. The only thing the server appeared to do was receive the files from the client and write them to disk. There was an SSL library employed to ensure data transfer was encrypted, and there appeared to be some basic user account authentication, but the code looked so simple we couldn’t quite understand why it had been done at all, rather than transferring the files using any of the standard secure UNIX command line utilities (e.g., scp or rsync over ssh); we would never know why. I simply looked over his shoulder and added as many insightful comments as I could.

Rich now turned his investigation towards a second executable process which had ultimately been responsible for deleting the mystery imagery after receiving connections from two computers within the internal network and after connecting to the same Quantum Sciences and Technology Group server involved in writing the files. I tried to see what I could learn about these two new computers. One appeared to be a workstation in the Network Operations Control Center (NOCC) of the Space Flight Operations Facility (SFOF), building 230; it’s like Houston’s Mission Control Center, the one you’ve probably seen depicted in movies. The other computer’s name and traceroute didn’t betray anything about its location, other than it being assigned to MER and being somewhere in the same building. Rich decompiled this second executable and we made sense of what the code did. It didn’t take long for us to see that it was little more than a backend server acting in part as a remote photo gallery and data browser. Clients could connect to it securely, they could view the photos, they could delete photos, they could perform some additional as yet unknown operations on the photos, and they could download the science data and rover status. We were most curious about the nature of those additional unknown image operations, but we wouldn’t learn more until the next night.

We had learned about as much as we could with the information we had. We needed more data. We wanted to know exactly what communication was going on between this image server and the three computers with which it communicated. We needed to eavesdrop, and a regular packet sniffer wouldn’t work. The problem was that inbound and outbound connections were encrypted. This is where Rich’s deep security knowledge saved the day.

He proposed we execute a man-in-the-middle attack on the server, an approach by which you insert yourself between two parties just as they are about to introduce themselves, and then pretend to each party that you are the other, repeating everything the other person says to his intended recipient, word for word, while recording all of it. The technical details are too complicated to get into here, and I’d probably get them wrong, but the basic idea is that instead of a client connecting to the server it intends, it instead connects to a spoof server listening at the original address (and port). This spoof server then acts as a relay, connecting to the real server and passing data back and forth from client to server as though they were talking directly to each other. All data is still encrypted for protection from the outside world, but this man-in-the-middle is able to read and record everything unencrypted.

The actual attack is brutally simple: the security used for the original client/server was the web standard, SSL, and he already had man-in-the-middle attack code he’d downloaded from some hacking site that would handle the grunt work involved. He modified the original server binary to listen on a new port and set up the attack code to listen on the original port, passing the connections through and on to the original server at the new port. He put this code in place, killed the original process, launched this modified version and the spoof server, and we called it a night. Tomorrow we should have more data to go on.
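At its core a relay of this kind is just a two-way byte pump. The sketch below handles plain TCP only; the real attack also terminated and re-initiated SSL on each leg, which is the part the downloaded tool handled and which I won’t attempt to reproduce:

```python
import socket
import threading

def pump(src, dst, log, tag):
    """Copy bytes src -> dst until EOF, recording every chunk."""
    try:
        while True:
            data = src.recv(4096)
            if not data:
                break
            log.append((tag, data))
            dst.sendall(data)
    except OSError:
        pass  # the other direction tore the connection down first
    finally:
        try:
            dst.shutdown(socket.SHUT_WR)  # pass the EOF along
        except OSError:
            pass

def relay_once(listen_port, target_host, target_port, log):
    """Accept one client and splice it to the real server, logging both
    directions of the conversation."""
    srv = socket.socket()
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind(('127.0.0.1', listen_port))
    srv.listen(1)
    client, _ = srv.accept()
    upstream = socket.create_connection((target_host, target_port))
    threads = [
        threading.Thread(target=pump, args=(client, upstream, log, 'c->s')),
        threading.Thread(target=pump, args=(upstream, client, log, 's->c')),
    ]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    client.close()
    upstream.close()
    srv.close()
```

Neither endpoint can tell the relay is there; each sees a well-behaved peer at the address it expected.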

I should note, Rich is forever proving his genius by his thoroughness. One such example was that he put in a safeguard to try to protect us from detection. There was always the chance, however slight, that the man-in-the-middle attack might in some way interfere with the client-server connections. We couldn’t test this, not having the client software we’d need to conduct the tests. If there was a problem, the first thing most admins would do would be to restart the server. If they restarted it and it continued to have problems, they would likely investigate, and that would be disastrous. Our modification would be easily seen, and the change so inexplicable as to make the purpose instantly clear. To protect us, Rich installed a script that would constantly check to make sure the server software was still running. If anyone shut it down, Rich’s script would instantly kill our spoof server and restore their original binary. This particular safeguard wasn’t tripped, but there have been plenty of times when my work was salvaged by just such thoroughness on his part.
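The safeguard’s logic is simple enough to sketch. Every name and path below is hypothetical (I don’t know the real ones), and this assumes a system with the standard `pgrep`/`pkill` utilities:

```python
import shutil
import subprocess
import time

# Hypothetical names and paths, standing in for the real ones.
SERVER_NAME = 'data_server'
SPOOF_NAME = 'spoof_server'
ORIGINAL_BACKUP = '/tmp/data_server.orig'
DEPLOYED_PATH = '/tmp/data_server'

def is_running(name):
    """True if a process with exactly this name exists (uses pgrep)."""
    return subprocess.run(['pgrep', '-x', name],
                          capture_output=True).returncode == 0

def restore(backup, deployed):
    """Put the original binary back in place."""
    shutil.copy2(backup, deployed)

def watchdog(poll_seconds=2):
    """Wait for the patched server to die; then kill the spoof relay and
    restore the original binary before anyone looks too closely."""
    while is_running(SERVER_NAME):
        time.sleep(poll_seconds)
    subprocess.run(['pkill', '-x', SPOOF_NAME], capture_output=True)
    restore(ORIGINAL_BACKUP, DEPLOYED_PATH)
```

The point is the race it wins: by the time an admin runs the restarted server, everything on disk looks exactly as it did before.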

Martian days are a little longer than Earth days. It would be about 40 minutes later on Earth when Spirit would see us rise above the Martian horizon. So at about quarter to ten in the morning our man-in-the-middle attack began. I got a simple text from Rich at a few minutes after ten: “we have data”. I was in meetings all morning so there was little I could do and no contact I could discreetly make to see what he’d found. We met a few hours later for a quick lunch. He’d not had a chance to look hard at the data; he’d only confirmed the conversations between computers were being recorded. Work kept us both too busy to do any investigation during the day. We couldn’t get together that night. It was agonizing, but the mystery would have to wait. My wife had been complaining about my recent absence. Rich was behind on one of his projects, and the next day, Friday, was a big deadline, and he had to make it. We agreed to get together Friday evening.

Chapter 7: Analysis of a Packet Dump

I’m not sure whether our luck could be considered good or bad on this particular Friday the 13th. By the end of the day I would come to a completely new and profound emotional understanding of the scene in the Matrix where Neo has to make a choice about which pill he takes, which reality he chooses. I realized with alarm how tempted I would be to take the blue pill, to just go back to life as I knew it. I thought I was better than that.

The work day was uneventful. I worked. Rich had stayed at work late the night before, and he’d met his deadline. My wife and son were getting together with her sister and his cousins that evening, and I was able to make my way to Rich’s by 8:00 pm.

The true beauty of the off-the-shelf man-in-the-middle attack code Rich had used is that it dumped all the network traffic in a standard format that many applications could read, the libpcap format. We could easily explore the data with the open source packet sniffing software Wireshark. What otherwise would be a messy conversation was now easily followed, searched, filtered, etc.

Conversations between clients and servers are not totally unlike human conversations; though admittedly more like a conversation ordering a meal at a drive-through window than an existential conversation with a long time friend.

This conversation started with the server (we began to call this server the “data server”) acknowledging the client’s (we adopted the name “data reviewer” for this client) connection. The data reviewer introduced itself and supplied a username and password. The data server acknowledged that the user was now logged in. The data reviewer then requested the data corresponding to Spirit’s current sol (sol 1818). The rover can actually hold over a month’s worth of limited data (though this is not done), so the request was in terms of sol of transmission, not when the data (imagery, science data, system data) was collected. The transmission was being made very early on the Martian morning of sol 1818, the data having been primarily collected on sol 1817. The server responded by sending the client an XML document listing what assets were available. The imagery could include thumbnail imagery from each of the five on-board cameras (forward and rear hazcams, navcam, pancam, and microscopic imager). The client then requested and was given each asset in turn: the imagery, the science data, the system status. In the midst of this conversation the server began another similar conversation with the second data reviewer we’d logged. A similar pattern was then observed in both clients: after receiving all the data from the data server, each kept its connection idle for four to five minutes. Then one of the data reviewers began to issue a series of commands. There were a handful of “REMOTE_DELETE [ASSET ID]”, several “REMOTE_TRANSFORM [ASSET ID] [UNKNOWN BINARY DATA]”, and at the end of the conversation a “DELETE_ALL”, followed by a “CLOSE”.
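For clarity, here is a paraphrase of the command grammar as we reconstructed it. The verbs are the ones we observed; the line framing and the asset ID format shown here are simplifications, not the literal wire format:

```python
# Toy parser for the reviewer-to-server command grammar. Asset IDs and
# the text framing are illustrative; only the verbs were observed.

def parse_command(line: str):
    """Split a reviewer command into (verb, asset_id, payload)."""
    parts = line.split(' ', 2)
    verb = parts[0]
    if verb in ('REMOTE_DELETE', 'REMOTE_TRANSFORM'):
        asset_id = parts[1]
        payload = parts[2] if len(parts) > 2 else None
        return verb, asset_id, payload
    if verb in ('DELETE_ALL', 'CLOSE'):
        return verb, None, None
    raise ValueError('unknown verb: ' + verb)

# The tail end of a session, as we saw it (asset IDs invented):
session = [
    'REMOTE_DELETE ASSET-0017',
    'REMOTE_TRANSFORM ASSET-0018 <opaque binary blob>',
    'DELETE_ALL',
    'CLOSE',
]
parsed = [parse_command(line) for line in session]
```

Laid out this way, the oddity is easier to see: every destructive verb operates on data that, by then, should already have been committed to the normal downlink.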

A few interesting things occurred as a result of each command issued. With each of the REMOTE commands, a connection was made to the Quantum Sciences Technology Group server we had concluded had sent the data to the data server (via the separate process I mentioned earlier); we called the server which had sent this data the “radio server”, since it was as close as we could get to whatever was receiving the signals from the rover. These connections from data server to radio server consisted only of relaying the REMOTE command sent to the data server, followed by the radio server confirming action with an ACK. Possibly unrelated to these other REMOTE commands, we also saw the data server connect once to the radio server and issue a “STATE” command, which caused the radio server to respond with what looked to be a large chunk of compressed or encrypted data; we would need to try to decrypt or decompress and investigate that data later. The last thing we noticed was that almost immediately after the DELETE_ALL was received, all the data temporarily housed on the data server was deleted. And when a data reviewer sent the CLOSE, they not surprisingly disconnected.

We captured all the credentials necessary to connect to the data server and radio server, but connecting to them was a source of anxiety, until and unless we felt confident that we could get the protocol exactly right. If we got it even slightly wrong, the servers would very likely log the event and might throw an exception that a developer may take notice of.

The credentials involved weren’t recognizable as names of people, so we still had no better sense of which people within JPL were involved.

The night ended with us investigating the STATE response the radio server had sent. The data was merely bzip2 compressed. Rich already knew this, as the data began with “BZh”, which are the magic bytes indicating bzip2 data. I felt like an idiot for not immediately seeing this; in my defense, I really wasn’t at my best, I’m notoriously dysfunctional when I’m even slightly short on sleep, and these last days had been taxing me.
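The check Rich did in his head is trivial to do in code; Python’s standard library handles bzip2 directly:

```python
import bz2

def looks_like_bzip2(data: bytes) -> bool:
    """bzip2 streams begin with the magic bytes 'BZh' followed by a
    block-size digit '1'-'9'."""
    return data[:3] == b'BZh' and data[3:4].isdigit()

# Stand-in payload; the real STATE response contents aren't reproduced here.
blob = bz2.compress(b'ROVER STATUS: nominal')
decoded = bz2.decompress(blob) if looks_like_bzip2(blob) else None
```

Checking magic bytes first is the polite way to do this; blindly calling `decompress` on arbitrary data just earns you an exception.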

The status data turned out to be Spirit’s status, a wide range of vital data including its health indicators, operational parameters, and predicted location. Included in the data was a timestamp which seemed potentially interesting because it appeared to be from Spirit’s internal clock, and the difference between this listed time and our local time was less than a minute. Spirit was over 19 light minutes away.
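The arithmetic is worth spelling out. The distance below is an illustrative Earth-Mars range for that period, not a precise ephemeris value:

```python
# Sanity check on that clock delta: how long should a radio signal from
# Spirit have taken to arrive? Distance is illustrative, not ephemeris data.

C_KM_S = 299_792.458        # speed of light, km/s
DISTANCE_KM = 345_000_000   # roughly 2.3 AU, a plausible Earth-Mars range

one_way_delay_min = DISTANCE_KM / C_KM_S / 60
# ~19.2 minutes one way: a status timestamp within a minute of local
# time cannot have arrived by conventional radio.
```

Nothing subtle about it, which is what made the sub-minute delta so hard to explain away.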

And with that last discovery, just minutes before 1:00 am, we called it a night. Tomorrow was Saturday and we’d get together for a few hours in the afternoon to see where we went from here.

Chapter 8: Testing Our Theory

I spent the morning and early afternoon doing family things, feeling frustratingly disconnected. It seems such an unkindness to spend time with one person wanting to be somewhere else doing something else. They were kind enough not to notice or at least say anything. I suppose my distraction was nothing new, I was often absorbed in my work. I was always dimly aware of it, and always tried in my own way to make up for it in moments where I wasn’t distracted. I made the same vow this time.

Rich and I met a little later than expected, at roughly 4:30 pm. We only had about two hours free before each of us had important Valentine’s Day obligations.

Every day seemed to bring new questions and few answers. We were making new discoveries, but few seemed to resolve into some definite conclusion; instead each mystery would lead down an avenue with a roadblock preventing us from further exploration.

We reviewed what we felt confident we knew. Some mystery radio on Spirit was able to transmit to Earth in a fraction of the time it would have taken for a conventional radio signal to reach Earth. The radio had a modest bandwidth. The data being transmitted by this radio included rover status, science data, and thumbnails of current imagery. The data quickly made its way from radio server to data server, where two data reviewers viewed the data, seemed to select some of these images for remote deletion, selected others for unknown transformations, and then deleted all the temporarily downloaded data. A query sent to the radio server appeared to be able to request and almost immediately get (at speeds faster than light) a status report from the rover.

The most pressing issues we might be able to make progress on related to confirming that these status reports were indeed coming live from Spirit, understanding why some assets were being remotely deleted, and understanding what the transformations on those assets represented.

The first item we wanted to know about was whether or not the status messages from the radio server really represented the nearly real-time status of the rover. It seemed quite likely they were instead from an Earth-based service regurgitating information from time-delayed status reports. We noticed that included in these status messages was information about the current sequence being executed by the rover. If we could get a series of these status reports during a time when we knew the rover would be executing uplinked sequences we could subsequently compare the timings we observed with the data the rover would later downlink about the timing of its activity. If the timings matched fairly closely, this would prove the status reports were in fact live and nearly instantaneously delivered from Spirit.

It was almost Martian noon at Spirit’s location, and we were able to fairly quickly learn from RP preliminary reports what activities Spirit would be up to for the next few hours. Now we just needed to write a job to capture the status reports.

Based on the protocol we recreated from the libpcap dump, Rich and I wrote a little job that would connect to the radio server as though it were the data server (and coming from the same IP), query status, collect the response, and disconnect. We must have walked through the code 15 times before feeling confident enough to actually run it, terrified we might screw up and be detected. With nothing left to do, we ran the job. To our great relief it appeared to work perfectly, dumping and unbzipping the data to standard output (the screen). Rich set up a job to run that code every 5 minutes, encrypting and storing the results for later review. We broke for the day. Rich would let that job run for the next 12 hours.
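The shape of that job looked something like the sketch below, with a hypothetical address and guessed framing; the real protocol details aren’t reproduced here:

```python
import bz2
import socket

RADIO_SERVER = ('192.0.2.10', 4000)   # hypothetical address and port

def query_state(addr=RADIO_SERVER) -> bytes:
    """Connect as the data server would, ask for STATE, and return the
    decompressed response. The newline framing here is a guess."""
    with socket.create_connection(addr, timeout=30) as s:
        s.sendall(b'STATE\n')
        chunks = []
        while True:
            data = s.recv(4096)
            if not data:
                break
            chunks.append(data)
    return bz2.decompress(b''.join(chunks))

# The actual job ran on a schedule, encrypting each result for later review:
# while True:
#     store_encrypted(query_state())
#     time.sleep(300)
```

The caution around running it at all was warranted: a malformed request to an undocumented server is exactly the kind of thing that ends up in someone’s error log.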

As a side note, Rich was once again careful: he included in the code a check to make sure, before connecting to the radio server, that there were no existing connections to it or to the data server from the data reviewers. He did not wish to use up bandwidth they might notice was missing. This didn’t guarantee that someone else, somewhere, might not notice, particularly those operating the mystery radio itself. But we could do nothing to protect ourselves on that front.

Sunday we were getting together at 5 pm to review what he’d collected. Rich was very religious, and he spent most of his Sundays involved in church related activities. His religiousness was a perpetual fascination to me because we had similar views on almost every other social, scientific, philosophical issue. On the issue of religion we were worlds apart. I always suspected his devotion was a great coping and motivational mechanism; it clearly brought a great deal to his life, but my own bias made it very hard for me to take his belief entirely seriously. He held some fairly radically religious views which didn’t seem supported by the same scientific logic or even common sense that he applied in every other area of his life. At any rate, I was never able to learn the exact nature of his full beliefs, let alone try to change them (which wouldn’t have been my intent anyway).

It didn’t take long before we were able to convince ourselves that the status messages the radio server was supplying us truly were live and direct from Spirit. The logged sequence execution times matched the sequence changes we’d seen in the status, with timings that made it clear light speed wasn’t delaying the data in the status messages.

Chapter 9: Remote Control

We now turned our attention to the REMOTE commands issued by the data reviewer to the data server. On the face of it, previewing images via thumbnails and deleting those you don’t like is not unusual. Bandwidth was limited and not all images are worth sending to Earth. And while we couldn’t tell what the remote transform operations did, it seemed plausible they might be used to adjust lighting, coloring, etc. to see if the images could be improved before delivery to Earth. But what was peculiar about this arrangement is that we’d seen nothing in the code that would seem to have allowed a remote delete or transform to have modified the images before they were actually transmitted. In other words, the images would have already been en route to Earth, or at least irrevocably committed to being sent to Earth, at the time the remote delete or transform would be issued. These commands wouldn’t have saved bandwidth. This raised a peculiar question: if the commands were not issued to save bandwidth, why were they sent? We knew the images for which the REMOTE_DELETE had been issued were not in the available image archive or even listed in the logs of images received, so how had they been removed? They would have needed to be intercepted after being received by the DSN, and at that point, why bother? If there was something so obviously wrong with the construction of the images, as happens from time to time, why not just let the data team discover and disregard them as part of their EDR assembly from the DSN? And maybe that is what they were still doing, but then why all the mystery?

We began to wonder specifically about the transforms. There was no way we could decode the actual binary instruction passed with the transform command. We didn’t have access to either the data reviewer software or the radio server software, so we had no code to decompile which might reveal what the data meant. Presumably the data indicated a type of transformation to make and some parameters for the transformation requested, but we’d never be able to tell exactly what without access we didn’t have. That said, if the data indicated a type of transformation, we might be able to detect it by comparing the thumbnail to the EDR, looking for obvious changes, and then looking for commonalities in the code being passed. As we began to look through the EDRs for images we knew had had transforms requested, we noticed that all the transforms were requested on images whose EDRs had missing data.

The wavelet-based ICER algorithm used on the rover to compress the Mars imagery controls for data errors by breaking each image up into many rectangular regions. When the data within a region is damaged, that region cannot be rendered, but the other parts of the image still can. These unrenderable regions appear as black rectangles in the EDR images; often one image will contain many such unrenderable regions, forming larger connected black areas. That is how ICER is supposed to work, but what made little sense to us was that the transforms were requested before the images arrived corrupted, so they couldn’t represent an attempt to correct images not yet found to be corrupt. The thumbnails were fine, and the thumbnails had been made from the images stored in the rover’s flash memory, so the original images must have been fine before transmission. The only conclusion we could come to was that the transforms were causing the corruption. Given what we had realized about the remote deletions, these corruptions must have been introduced here on Earth, not as the result of transient interference with the signal on its way to Earth; they must have occurred after the signal was received by the DSN but before it was made generally available to the scientists or the archive. Why would someone be trying to effectively remove data from these images? We stared at the related thumbnails for quite some time without seeing anything obviously unusual; one did have a mildly interesting rock that was obliterated by the artificially induced blackness, and another had a region of disrupted soil that seemed interesting. Nothing, though, that appeared to us to be of particular scientific importance. But the thumbnails showed so much less detail than the originals that we began to feel you would need to have known what you were looking for in order to find anything in them.
Rich made a comment at the time that stuck with me, “If you or I found an igneous rock in a field of sedimentary rocks, we’d think nothing of it except to think, that rock looks mildly more interesting than the others. But a geologist seeing that exact same scene would be able to read into it the unusual and perhaps unlikely history of that place.”

Rich would leave his man-in-the-middle attack in place for several weeks, gathering more and more data. And I will briefly jump ahead in time to say that we ultimately found evidence of a second transform. We were able to work out that the first byte in the transform binary data was a type specifier indicating the kind of transform to perform. All the original ones we’d seen with the missing data were of the “E” type. We began to call that the “error transform”, since it seemed to introduce errors and began with the letter “E”. The new type we found was identified by the letter “M”, and we began to call it the “merge transform”. By comparing the mystery thumbnail to the EDRs ultimately archived, we saw that this merge transform involved placing new image data on top of the original image, replacing or obscuring content. The details of the discovery of this merge transform were interesting but will need to be withheld; I haven’t thought of a way to tell that part of the story without hiding Rich’s identity with implausible lies.
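A sketch of the classification we settled on; the payloads shown are made-up stand-ins, since the real binary data was never decoded beyond its first byte:

```shell
# Label a captured transform payload by its first byte:
# 'E' = what we called the "error transform", 'M' = the "merge transform".
classify_transform() {
  case "$(printf '%s' "$1" | cut -c1)" in
    E) echo "error" ;;
    M) echo "merge" ;;
    *) echo "unknown" ;;
  esac
}
classify_transform "E01F3A"   # prints "error"
classify_transform "M0210"    # prints "merge"
```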

We ended that Sunday feeling confident that we were getting much closer to solving the mystery. We felt if we could just collect enough data the nature of what was being censored would become obvious. And perhaps the censors would become indirectly obvious as well, once we understood their motivation. Letting the man-in-the-middle attack continue to run should have provided us the data we needed.

Meanwhile, we could continue to pursue the remaining mysteries by other means.

Chapter 10: In Search of a Who and a Why

The mysteries that were important to us had shifted. Initially I deeply wanted to know whether or not quantum teleportation was real, wanted to find proof that it was scientifically possible. Now it hardly seemed to matter. I had accepted that something was allowing faster than light communication and whether it was quantum teleportation or a fairy’s pixie dust no longer seemed critical. The issue of who was using this faster than light radio and why were they using it to apparently censor rover images, those were the only questions that seemed truly relevant any more.

We got nowhere finding a who by trying to trace the radio server or data reviewers. I tried narrowing down the location of the radio servers through traceroutes, comparing the results with other servers I knew were operated by the Quantum Sciences and Technology Group, but all the computers were on the same network segment, and my tests proved fruitless. Similarly, attempts to locate the exact computer acting as data reviewer within the NOCC proved fruitless. I did learn something puzzling about it, though. I had an opportunity to be in the NOCC on non-MER business a few times with a laptop from the project I was working on. On two of those occasions I was able to plug in to the network and discreetly run my copy of Wireshark for a few minutes to capture network packets for later inspection. I was interested in seeing if I could find any packets from the mysterious data reviewer; not packets destined for the data server, but just unrelated packets to or from the data reviewer’s IP. Both times I did find packets from it. But what struck me as very curious was that the packets in the first session originated from a different MAC (network card) address than in the second session. It could simply have been that the IP was dynamically assigned and another computer had since received it, but that seemed unlikely. Another possibility I considered was that the data reviewer was run from inside a virtual machine stored securely on a flash drive, and could thus be moved easily from computer to computer. I was never able to find out which was the case.

We discovered something alarming about the other data reviewer. Rich tried to connect to the computer from which the connection had come, and was ultimately able to get root (administrator access) on it, having found a buffer overflow exploit that no one had bothered to patch; the server itself appeared largely unused, and was running an OS that was at least 8 years out of date. After a few minutes checking log files, Rich discovered that the data reviewer connection did not originate from this server, but was instead being relayed (tunneled through this server via SSH). The originating connection was from an IP block reserved for the Department of Defense. A traceroute led to Arlington, Virginia, just outside Washington, DC. The fact that someone at the DoD was involved in reviewing and possibly censoring images from Mars was profoundly strange. Rich and I had still seen nothing so anomalous in the thumbnails that we could explain why any of them would be of concern to anyone. And while we knew there was talk of NASA and the DoD becoming increasingly friendly in support of a new space race, we hadn’t imagined it had already happened, or that it would happen like this. We were starting to get very spooked.

Every few days Rich would give me updates on what the man-in-the-middle attack was collecting. It was easy to stay on top of because relatively few images were being deleted or transformed. We’d each look at the images independently, fail to see whatever the censors were seeing, and wonder what we were missing. Some interesting features were being obliterated, but none, in miniature, seemed that interesting. On a few occasions he and I would get together to review the images, and sometimes we’d get into arguments, each of us at different times insisting that this or that feature was surely the offending element. We would try to convince the other that some feature was suspicious, but never once did we both fully agree on the same image. It never really seemed to us anything more than pareidolia: seeing shapes you recognize in clouds and other random data.

After some weeks of this, progress slowed. We’d temporarily abandoned our fruitless search for a who, and now focused only on casual attempts to understand the why, as new censored images came in.

Chapter 11: Suspicion Falls

Early one Monday morning I got a cryptic voice mail from Rich. Rich usually gets into work before I’m even awake. When I woke I saw I’d missed his call (my ringer had been off), and checked my messages. Rich said nothing more than, “Call me,” but there was something in his tone that made me anxious. I called him right back. He said he couldn’t talk, but that we needed to meet for lunch. His tone sounded better, but a little forced.

At a little after 1 pm we met at one of our regular spots, President Two, a Thai place near JPL. I was already seated when he approached looking a little worried.

“Someone tampered with the server in my office late last night. I got in this morning and it had rebooted. I checked the logs and the entries stopped abruptly a little after 2:00 am. The server didn’t come back up until nearly 5:30 am. I think they cloned my drive. The server had been moved, and one of the screws on the case was now cross-threaded. I opened up the case and, sure enough, I noticed someone had switched the power cables connected to the hard disk. A few months ago I replaced that wonky power supply, and by mistake I hooked the drive up to one of the shorter power leads. I didn’t realize it until I was reassembling the case and had to stuff all the other cables back inside. That power cable was stretched tight enough that it was a pain in the ass to stuff in the cables, but I was in a hurry and figured I’d fix it later. I never did. When I opened up the case today, one of the long cables was connected to the drive. Before I realized my case had been opened, I had checked with Mitchell and Oliver next door, and both their desktops, and the desktops their graduate students were using, had been rebooted. Initially I just assumed there had been a power issue in the building, but then I noticed that clock of mine was still telling the right time.”

“What could they have gotten?” I asked. This was the server from which we’d been connecting to the other servers. He had taken plenty of precautions, including deleting entries in the log files of computers he connected to and misdirection by connecting via other shared servers using shared root accounts.

“Nothing, I hope. The real OS and file system is on the hidden and encrypted partition. They’ll just see a basic Linux install and a whole lot of unused disk space.”

“What about logs elsewhere?” I asked.

“I was careful, but obviously someone had some reason to look at a bunch of computers on this segment of the network. Maybe they looked at a routing table and found something cached that pointed suggestively back to this segment. Maybe I missed a log, or forgot to delete something. Am I just being paranoid here? Have you ever heard of ISAS sending netops or sysadmins to service our computers like this? I mean with no prior notice, and super early on a weekday?”

“I don’t know. If you’re sure your case was opened and your drive disconnected, then that seems like proof it wasn’t them. They definitely wouldn’t have done that. I guess there’s no way you can discreetly inquire with ISAS? Or maybe at least find out how many people’s computers were involved?”

“I asked a few other people down the hall, right before coming over here, and their computers had been rebooted, too. No idea if their cases were opened or how long their computers had been down. But from what they said, and from the people I talked to, at least 10 people were affected, probably a few more. I did stop by Rodger’s, and his computer hadn’t rebooted; but then, he isn’t on the same network segment. I remembered that because last year, when we had that router failure, his connection kept working fine.”

“So, what do we do now?” I asked.

“Nothing,” he replied. “Absolutely nothing. Let’s just see if anything comes of all this. Maybe they’ll make some announcement, or someone will say something. Maybe I am just being paranoid. Maybe it’s part of some ISAS audit.”

“Maybe, but they wouldn’t have opened your case or touched your hard drive,” I said.

“I know,” he said.

We left it there, and tried to distract ourselves with normal conversation for the rest of the meal.

I double checked my laptop when I got back from lunch and found no reboots. I made absolutely sure that none of the mystery files or any tangentially related files I’d created could be found. They were all safely on my hidden, encrypted partition. To be extra safe I ran a program that wipes slack space and deleted files.

Chapter 12: Paranoia Runs Deep

For the next week we did nothing and said nothing about any of this. I was so paranoid and disturbed by Rich’s experience that I even avoided viewing the new public Martian imagery, which I used to enjoy looking at during lunches at my desk. I was afraid that somehow my particular pattern of clicking through the images might act as a unique fingerprint they’d recognize from the week before, and that suspicion might then lead them to me.

Nothing was announced about the reboots. Nobody seemed to regard the matter with any interest. Most of the machines affected had been Windows machines and I’m guessing everyone dismissed the event as necessary sysadmin initiated patches, or perhaps some temporary localized power outage.

The following Monday was February 23rd. I got in at my usual time. Rich called me a few minutes after I got in.

He started, “Mitchell’s house was broken into while he and his wife were at work on Friday. Someone broke in through his bedroom window on the first floor and made off with Mitchell’s new laptop, his wife’s laptop, an old laptop he kept in the living room for MAME, and two spindles of DVDs that contained backups of his laptops. Mitchell said the cops took fingerprints and would check with the local pawn shops, but that he shouldn’t realistically expect any good news. The police seemed to think it was a drug user or homeless person who broke in. The thief ignored more valuable items in the house, including jewelry and cash on top of the dresser a few feet away from the window the thief climbed in through.”

Rich was obviously viewing Mitchell’s story through the paranoid lens of wondering whether the same people who had cloned our computers had now directed their suspicions fully onto Mitchell, and had stolen his computers in order to gather more evidence.

We were both deeply uncomfortable, not sure how much blame to accept or reject. And we were terrified we’d be next. That night in a fit of anxiety I wiped the hidden partition that still held my copies of the items Rich had given me, as well as some of my own notes. He still had the original data, and copies of some of my notes. There was little point in my keeping these copies if it was just going to drive me crazy with worry.

I bumped into Mitchell later in the week as he was leaving a meeting and told him how sorry I was to hear about the robbery. He said thanks, but said that by comparison the robbery was the better part of his week.

A project Mitchell had been championing for years, which had finally been approved for funding less than six months before, had just been unceremoniously canceled. The reasons cited were cutbacks related to the economy and the belief that some of his project’s mission goals could be rolled into future ESA and JAXA missions. It was a plausible explanation. Budgets had shrunk, and ESA and JAXA had missions that might support some of the science his project was to do, but it was highly unusual for a project that had already been given its funding to be shut down in this way.

Within a few months Mitchell had retired. The cancellation of this project effectively forced him out; at his age, and with the opposition he now felt was working quietly against him, he knew there was nowhere left for his career to go. He knew the best days were well behind him.

Rich and I felt awful. We couldn’t prove it, but we were deeply suspicious that Mitchell was being blamed for something we had done, that the message being sent to Mitchell had been intended for us. We got the message, if indirectly.

Chapter 13: The End

The next few weeks were terrible. Rich and I had begun to talk less and less. I think our guilt over Mitchell, combined with our unspoken belief that one or both of us was under the same umbrella of suspicion, made us feel very uncomfortable associating with each other.

So without even discussing it with Rich, I decided to leave JPL. I left not long after Mitchell. I saw that the best days of my career were also behind me, that forces within JPL, or perhaps just within myself, were going to prevent me from ever seeing my job or my work the same way. And I no longer knew what our mission really was, no longer felt I knew who I was really working for. I looked for a way to get out. I lined up some private consulting gigs, and I gave my notice.

Life outside required readjustment, but it went smoothly enough. I still interacted on a daily basis with many of the same types of people, if not the same people. Everything was the same, yet everything was different; such is life.

The awakening that began with the discoveries Rich and I made only intensified when I left JPL. I became more and more interested in trying to understand what we’d learned, more and more interested in revisiting older odd events that I’d previously been too eager to dismiss. And my research and exploration has brought me here, to publish my experience publicly, hoping that others may help fill in the gaps I can’t. Because, since exiting NASA I realize my journey has only just begun.

What pieces of this puzzle do you have?

© 2010. All Rights Reserved. https://astroengineer.wordpress.com


This entry was posted on Wednesday, April 7th, 2010 at 6:07 pm and is filed under General. You can follow any responses to this entry through the RSS 2.0 feed. You can leave a response, or trackback from your own site.


Feb 11

OpenVPN ConfigQuery

Just a quick note on setting the security for OpenVPN Access Server. To disable the older ciphers you can set the cipher suite from the command line with the sacli tool, then confirm it with ConfigQuery. For our AWS EC2 instance I simply used this line:

/usr/local/openvpn_as/scripts/sacli --key "cs.openssl_ciphersuites" --value 'EECDH+CHACHA20:EECDH+AES128:EECDH+AES256:!RSA:!3DES:!MD5:!RC4' ConfigPut

Then confirm with:

/usr/local/openvpn_as/scripts/sacli ConfigQuery | grep "cipher"

Finally restart the service:

service openvpnas restart

Oct 18

Denyhosts on CentOS

In general, I install fail2ban and denyhosts on all of my external Linux servers that have port 22 open.  This is generally only because SFTP is also installed on these systems, because marketing people don’t know about other options like S3 on AWS.

I want to point out the files that need to be looked at before you enable daemon mode via: sudo service denyhosts start.

First is the config, located at /etc/denyhosts.conf.  That tells it to look at your /var/log/secure and update the /etc/hosts.deny file, among other things.

If you have IPs or hosts that need to be whitelisted, you need to add them to a file that belongs to denyhosts.  It’s at: /var/lib/denyhosts/allowed-hosts

Once you start the service it will list all of the hosts that will be denied; verify that the list doesn’t include anything that matters to you.  If it does, stop the service, delete those entries from /etc/hosts.deny, and add them to /var/lib/denyhosts/allowed-hosts.
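As a sketch of that fix-up, here is the un-ban sequence run against stand-in temp files (in real life you edit /etc/hosts.deny and /var/lib/denyhosts/allowed-hosts as root, with the service stopped; the IPs are examples):

```shell
# Stand-ins for /etc/hosts.deny and /var/lib/denyhosts/allowed-hosts.
deny_file=$(mktemp)
allow_file=$(mktemp)
printf 'sshd: 198.51.100.7\nsshd: 203.0.113.10\n' > "$deny_file"

ip="203.0.113.10"                        # example host to un-ban
sed -i "/^sshd: $ip\$/d" "$deny_file"    # drop its deny entry
echo "$ip" >> "$allow_file"              # whitelist it for denyhosts

cat "$deny_file"   # only 198.51.100.7 remains denied
```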

Then enable it at startup with: sudo chkconfig denyhosts on

And if you are on systemctl, then take a look here for startup: http://digitalsos.com/?p=45


Mar 22

netdom join Error 53, Error 2, DNS is correct. Windows 2012 R2 doesn’t like /ou

We have been using a script for years that joins Windows systems to our domain.  With 2012 R2 it stopped executing.  And this was not a DNS issue: ping your DCs by their friendly names from the system first, and if they resolve you are good.

At first I thought it was an issue with our 2012 R2 domain controllers.  After researching this, I saw that a duplicate SPN check can cause it.  https://support.microsoft.com/en-us/kb/3070083

Patch your DCs; this hotfix was incorporated into later patches, so if you’re up to date then you should be fine.

Remove the /ou section of your netdom join statement.  It worked for us for years, but now it just throws an error every time.  And before you say anything, I tried using CN for the computers section most of the time.  Our statement was:

netdom join $serverName /d:$Domain /ou:"OU=Computers,DC=cloud,DC=digitalsos,DC=com" /ud:SOS\joinUS /pd:$decrypted /reboot:20 >> $logfile

Now it’s just:

netdom join $serverName /d:$Domain /ud:joinUS /pd:$decrypted /reboot:20 >> $logfile

Hopefully you won’t throw away a week of your time chasing this down.

Feb 04

JDBC connector to MSSQL 2014

YAFD (Yet another fucking developer) that needs to use java to connect to YAFDB (Yet Another Fucking DataBase).

So start off and install the Oracle Java JRE 8 on your SQL server.  Then go here and get the latest Microsoft JDBC Driver, 4.2: http://www.microsoft.com/en-us/download/details.aspx?id=11774

Go to Control Panel –> System and Security –> System. Click on the left side “Advanced system settings”.  Click Environment Variables.  Click on NEW in the System variables and for the name use CLASSPATH.  If you extracted the zip correctly you can use this for the value: C:\Program Files\Microsoft JDBC Driver 4.2 for SQL Server\sqljdbc_4.2\enu\sqljdbc42.jar


So now we need to build the connection string as outlined here: https://msdn.microsoft.com/en-us/library/ms378428%28v=sql.110%29.aspx
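For reference, a typical connection URL for that driver looks like the following; the server, database, and credentials here are placeholders, not values from our setup:

```
jdbc:sqlserver://localhost:1433;databaseName=MyDatabase;user=appUser;password=secret;
```

The driver class, if you need to name it explicitly, is com.microsoft.sqlserver.jdbc.SQLServerDriver.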


Feb 04

Ubuntu SNMP config for Zabbix and Checkpoint

I’m setting up SNMP monitoring for our Checkpoint devices in AWS and Zabbix needs the SNMP client configured.  A good tutorial is located here:

But adding the templates for Checkpoint was more involved than I thought.  I grabbed the templates here: https://share.zabbix.com/network-appliances/checkpoint-fw-1-hardware. Then I created the mapping by going to Administration –> General –> on the far right, pull down to Value Mappings, and create new.

Then added the discovery scripts with: sudo cp advsnmp.discovery /usr/lib/zabbix/externalscripts/.

Then got the checkpoint mib file and added it to /usr/share/snmp/mibs/.  As long as you commented out the mib line in /etc/snmp/snmp.conf then you should be able to run snmptranslate -m +CHECKPOINT-MIB -IR -On memFreeReal64.0 and get an accurate translation.

The real test is to snmpwalk from the Zabbix server, for SNMPv3 use the following: snmpwalk -v3 -u UserName -l authPriv -a MD5 -A UserPassword -x DES -X EncryptionPassword memActiveReal64.0

Finally, and I’m not sure if this helped, but I exported the MIB via: export MIBS=+CHECKPOINT-MIB (net-snmp reads the MIBS environment variable)

Some useful web links: http://www.net-snmp.org/wiki/index.php/TUT:Using_and_loading_MIBS

And the net-snmp FAQ is really good: http://net-snmp.sourceforge.net/docs/FAQ.html#How_do_I_add_a_MIB_to_the_tools_

Oct 30

Build bitcoind from source Fedora 22

Disclaimer – this does NOT work.  It’s close, but no love.

As usual there are no good instructions on the net to do this.

First get the source.  I already had git installed, and I’m actually building Feathercoin instead of Bitcoin, but it should be the same for both.  Also I’m presuming that you already installed the build essentials like gcc.  If not, at a minimum you should have done:

sudo dnf install automake gcc-c++ openssl-devel gcc make

Go to the folder, or make a new folder like bitcoin, then:

git clone https://github.com/FeatherCoin/Feathercoin.git

Now we need to get and compile Berkeley DB 4.8

wget http://download.oracle.com/berkeley-db/db-4.8.30.tar.gz
tar -xvzf db-4.8.30.tar.gz
cd db-4.8.30/build_unix/
../dist/configure --prefix=/usr/local --enable-cxx
make
sudo make install

Install the Boost C++ libraries and qrencode; if you want the GUI, also add protobuf:

sudo dnf install boost-devel qrencode-devel protobuf-devel

Now you can run the standard build process that’s listed under doc/build-linux.md


Jun 12

Moving AD groups from one domain to another using ldifde

This is something that I almost never do so I’m documenting it here to avoid future mistakes.

We are migrating between domains, and while we considered ADMT, there was too much clutter and our AD structure changed.  So a piecemeal approach was decided on.  Hint to future self: don’t do this again, use ADMT!

I’m transferring groups but this has proved problematic.  Two main reasons for this, the AD structure is different and the Domain is different.

Big lesson first up: you need to exclude many SAM entries using the -o modifier.  My command for groups stripped out everything but member.  However, all the member mappings changed, so I excluded them as well.  I ended up with:

ldifde -f exportfile.ldf -s <serverName> -d "OU=Tomcat,OU=Security Groups,OU=IBM,DC=<domain>,DC=<domain>,DC=local" -r "(objectCategory=CN=group,CN=Schema,CN=Configuration,DC=<domain>,DC=<domain>,DC=local)" -o "badPasswordTime,badPwdCount,lastLogoff,lastLogon,logonCount,memberOf,objectGUID,objectSid,primaryGroupID,pwdLastSet,sAMAccountType,member"

Now to bring that over.  I can convert using the -c option.  In my case the OU’s lined up on the other side so the command was:

ldifde -i -f exportfile.ldf -s <serverName> -k -v -c "DC=<domain>,DC=<domain>,DC=local" "DC=<domain>,DC=<domain>,DC=org"

I have all the groups but no membership.  Use ADMT next time.

Errors that you can get with any of the SAM accounts is discussed here: https://support.microsoft.com/en-us/kb/276382


May 20

Tomcat 8 redirect and force SSL

Edit the tomcat8/conf/server.xml and add the following for port 80 (and another for 8080 if need be).

<Connector port="80" protocol="HTTP/1.1"
redirectPort="443" />

Now edit the tomcat8/conf/web.xml, and at the bottom, just above </web-app>, put in the following, changing “Entire Application” to your application in webapps.

<!-- SSL settings: only allow HTTPS access to MY APPLICATION -->
<security-constraint>
  <web-resource-collection>
    <web-resource-name>Entire Application</web-resource-name>
    <url-pattern>/*</url-pattern>
  </web-resource-collection>
  <!-- auth-constraint goes here if you require authentication -->
  <user-data-constraint>
    <transport-guarantee>CONFIDENTIAL</transport-guarantee>
  </user-data-constraint>
</security-constraint>

Now restart the tomcat service.

May 19

Java 8, tomcat 8, SSL setup from pfx, using 443

This took me a day to set up on a new CentOS Amazon image. To be honest, I’d never configured SSL for Tomcat before, and this was the first time that I’d used Tomcat 8. So I just want to go over the steps I had to do so I’ll remember all of the tweaks needed.

Configuring SSL was more painful than I expected. The first issue was that I had to break up the Microsoft IIS-formatted certificate I had. Fortunately, that I’ve done before. From Novell:

First create a new folder for all of this.
Type: mkdir cert
Type: cd cert
Now get the Intermediate and root certificates from your CA place them in the folder.
Get the .pfx certificate and put it in the folder.

To export the private key without a passphrase or password.
Type: openssl pkcs12 -in filename.pfx -nocerts -nodes -out key.pem

To generate an RSA-formatted copy of the private key
Type: openssl rsa -in key.pem -out server.key

To export the Certificate
Type: openssl pkcs12 -in filename.pfx -clcerts -nokeys -out cert.pem

The directory will now have a file cert.pem and a key.pem

Now from apache.org

* key.pem – your certificate’s private key
* cert.pem – your certificate
* domainIntermediate.crt – Organization Validation intermediate
* inter.crt – the intermediate CA that signed your certificate
* root.crt – the root CA that signed the intermediate CA

First, concatenate the CA certs, make sure the intermediate CA goes first:
Note on this – For the chain ON A NORMAL LOAD BALANCER, it’s intermediate first then domain Intermediate then the root, BUT if you want a unified cert like we are doing here the order is different, it would be domain Intermediate, then CA Intermediate, then the CA Root.  Makes no sense to me but for Comodo it is so.

$ cat domainIntermediate.crt inter.crt root.crt > chain.crt

Next, export the pkcs12 file:

$ openssl pkcs12 -export -chain -inkey key.pem -in cert.pem \
-name "server" -CAfile chain.crt -out server.p12

When prompted for the export password, enter something; don’t leave it empty.

Now, use keytool to verify:

$ keytool -list -v -storetype pkcs12 -keystore server.p12

Enter the export password for the keystore password. Then you should see
a line like this from the output:

Certificate chain length: 3

Tomcat8 should now be able to use that server.p12 file as its keystore.
Move server.p12 to the Tomcat home directory, which is /usr/share/tomcat8/
Make sure tomcat is the owner. Type: chown tomcat:tomcat server.p12
This server needs to use 443 instead of 8443. To do that we need to tweak Java permissions.
I used the guide at confluence but used the 5th option:

If using Linux 2.6.24 or later, you can set up a file capability on the java executable, to give elevated privileges to allow opening privileged ports only, and no other superuser privileges:
# setcap cap_net_bind_service+ep /path/to/bin/java
After setting this you may notice errors when starting Java like this, for example:
$ java -version
/path/to/bin/java: error while loading shared libraries: libjli.so: cannot open shared object file: No such file or directory
This means that the library is being imported from a dynamic path, and not in the trusted ld.so path. See http://bugs.sun.com/view_bug.do?bug_id=7157699 for details. To fix this, you need to locate the library, and add its path to the ld.so configuration. Note that the below is an example, and this may differ depending on Linux distribution. Replace JAVA_HOME with the correct location:
$ find JAVA_HOME -name 'libjli.so'

# echo "JAVA_HOME/lib/amd64/jli" > /etc/ld.so.conf.d/java-libjli.conf
# ldconfig -v
After setting this all up, you need to make sure that Confluence only starts java with the direct binary path, and not via a symbolic link, otherwise the capability will not be picked up.
Setting this up means that any user can open privileged ports using Java, which may or may not be acceptable for you

At this point I usually switch user to tomcat. To do that, edit /etc/passwd and change the tomcat user to use /bin/bash;
then, as root: su tomcat

We need to edit /etc/tomcat8/server.xml
Add a new connector like this:
<Connector port="443" maxThreads="200"
scheme="https" secure="true" SSLEnabled="true"
keystoreFile="${user.home}/server.p12" keystoreType="PKCS12" keystorePass="changeit"
clientAuth="false" sslProtocol="TLS"/>

Also in my case the application was to live on the root so to do that find the host section and add Context like so:

<Host appBase="webapps" autoDeploy="true" name="localhost" unpackWARs="true">
<Context docBase="/var/lib/tomcat8/webapps/YourAppName" path="" reloadable="true" />
</Host>

Exit out of the tomcat account, change it back to nologin, then restart Tomcat. Easy, right?
