There’s always something to howl about

Product (category) idea: Antoinette the anticipator.

I first thought of the idea of an anticipator as hardware, I kid you not. The early 1980s? Software was dear in those days, but early computer-on-a-chip devices were cheap and abundant. There still would have been a software component to an anticipator, of course, but not much.

Here’s what I thought about then: Anything that could be monitored by signal processing — as, for example, the communication between a micro-computer and its peripheral devices — could have an anticipator in-line, monitoring all the signal traffic back and forth. By maintaining a probabilistic database of past events, the anticipator could, over time, evolve strategies for anticipating resources likely to be called for in the near future, and, using otherwise dead time on the computer’s data bus, cache that data in advance, eliminating time lost on fetch requests made in real time.
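The core of that idea is simple enough to sketch in a few lines. Here is a minimal, hypothetical illustration (the class and method names are my own invention, not any real product): watch the request stream, count which resource tends to follow which, and use idle time to cache the most probable successor.

```python
from collections import Counter, defaultdict

class Anticipator:
    """Hypothetical sketch of the idea: observe a stream of resource
    requests, learn which resource tends to follow which, and use
    dead time to prefetch the most likely next request."""

    def __init__(self):
        self.follows = defaultdict(Counter)  # resource -> Counter of successors
        self.last = None                     # most recent request seen
        self.cache = set()                   # resources prefetched in advance

    def observe(self, resource):
        # Record the transition from the previous request to this one.
        if self.last is not None:
            self.follows[self.last][resource] += 1
        self.last = resource

    def prefetch(self):
        # In otherwise dead time, cache the most probable next resource.
        successors = self.follows.get(self.last)
        if successors:
            likely, _ = successors.most_common(1)[0]
            self.cache.add(likely)
            return likely
        return None
```

After a few repetitions of "open document, then load font," the anticipator would start pulling the font into the cache the moment it sees the document opened — no intelligence about documents or fonts required, just transition counts.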

Wow! How kludgey our world used to be! In the bad old days, there were pre-fetch routines built into operating systems, but they were a brute-force solution to a vast array of very small, fussy problems. An anticipator would strive for optimal efficiency by dealing only with the specific data most likely to be requested.

An example? If a font required for a document is not stored on your printer, the printer must fetch the outline data from your hard disk. It’s a small job, on its own, but you could maximize your productivity from the printer if those real-time fetch calls were eliminated by intelligent pre-fetching. The anticipator could both keep the most-often-used outlines in the printer’s memory and anticipate exceptions to the everyday rules — for example, by keeping the boss’s favorite Christmas font on the printer from Thanksgiving through Christmas. That implies real secretarial smarts, but it’s simply probabilistic database mining being perfected over time.
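The Christmas-font trick falls out of the statistics for free if you count usage over a recent sliding window rather than over all time. A hypothetical sketch (names and parameters are mine, for illustration only):

```python
from collections import Counter
from datetime import date, timedelta

class FontCache:
    """Hypothetical sketch: keep the printer's limited memory stocked
    with the fonts most used inside a recent sliding window, so a
    seasonal favorite earns a slot while it is actually in use."""

    def __init__(self, slots=3, window_days=30):
        self.slots = slots                      # how many fonts fit in memory
        self.window = timedelta(days=window_days)
        self.uses = []                          # (date, font) log of fetches

    def record(self, day, font):
        self.uses.append((day, font))

    def resident_fonts(self, today):
        # Count only uses inside the window; recency drives eligibility.
        recent = Counter(f for d, f in self.uses if today - d <= self.window)
        return [font for font, _ in recent.most_common(self.slots)]
```

Come December, the boss’s holiday font dominates the recent counts and gets cached; come January, it quietly ages out again. No rule about Christmas was ever written down.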

So what about now?

Antoinette the anticipator harkens back to Heidi and Sarah, and to Constance, which I haven’t gotten to yet.

Imagine an anticipator function in Sarah that, when she figures out that you are going to be late for a meeting, sends out all the appropriate notices, all on her own. What if Sarah knew how to gather all the information necessary to initiate a new for-pay job in your systems — creating the accounts, copying in the CRM data, issuing the work orders, generating the PERT chart — whatever. In my world, opening escrow for a new home-sale transaction entails a lot of (now mostly virtual) paperwork and a host of arcane details. I want Sarah (or someone!) to do that work without my involvement, except for oversight and quality control.

How would she do this? By watching you and generating, over time, probabilistic rules for highly repetitive functions. Let her talk to you and she can learn that much faster by asking you what you are doing. Data does not know what it is, and data-processing software does not know what the data it processes is. But signals are real — both digital signals and real-world events — and kinds and qualities and quantities and frequencies of signals can be collected, measured and analyzed. In the example above we had one anticipator watching one printer connection, but Sarah watches everything. She can not only cache fonts better than anything I imagined in the ’80s, she can watch the toner levels and order consumables on a just-in-time basis.
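Even the just-in-time consumables idea is just signal-watching plus arithmetic. A hypothetical sketch, with invented names and a made-up lead time, of how toner readings alone could drive the reorder decision:

```python
class TonerWatcher:
    """Hypothetical sketch: watch toner-level signals over time and
    reorder just in time, based on the observed consumption rate."""

    def __init__(self, lead_time_days=5):
        self.lead_time = lead_time_days  # shipping time for a new cartridge
        self.readings = []               # (day_number, percent_remaining)

    def reading(self, day, percent):
        self.readings.append((day, percent))

    def should_order(self):
        if len(self.readings) < 2:
            return False                 # not enough signal to estimate a rate
        (d0, p0), (d1, p1) = self.readings[0], self.readings[-1]
        if d1 == d0 or p1 >= p0:
            return False                 # no consumption observed yet
        rate = (p0 - p1) / (d1 - d0)     # percent consumed per day
        days_left = p1 / rate            # projected days until empty
        return days_left <= self.lead_time
```

Nobody tells the watcher how fast the office burns through toner; the office tells it, one reading at a time.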

Any data that can be collected can be analyzed, pattern-matched and acted upon. This is how your dog’s brain works, in essence. But software can be so much smarter and more productive than a dog. A long time ago, I wrote about spell-checking as a crowd-sourceable phenomenon. I was making jokes, but that turned out to be humor-for-one. But imagine if the text editor in your operating system used an anticipator as part of its spell-checking and auto-correction. Over time, it could adapt itself to you uniquely, correcting virtually all of your errors in real time. Signals are signals, and most of what you do, by now, consists of generating measurable signals. The kind of software I am describing could be incredibly productive, in your own unique life, in just a few weeks of working with you.
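The adaptive spell-checker, too, reduces to counting signals. A hypothetical sketch (the threshold and the names are my own, purely for illustration): record the corrections you make by hand, and once a typo-to-fix pair has been seen often enough, start making it for you.

```python
from collections import Counter, defaultdict

class AutoCorrector:
    """Hypothetical sketch: learn a user's personal typo -> fix pairs
    from manual corrections, then apply a fix automatically once the
    pair has been observed often enough to trust."""

    def __init__(self, threshold=3):
        self.threshold = threshold
        self.fixes = defaultdict(Counter)  # typo -> Counter of replacements

    def observe_correction(self, typed, corrected):
        # Called whenever the user fixes a word by hand.
        self.fixes[typed][corrected] += 1

    def correct(self, word):
        # Auto-correct only when one replacement clearly dominates.
        candidates = self.fixes.get(word)
        if candidates:
            best, count = candidates.most_common(1)[0]
            if count >= self.threshold:
                return best
        return word
```

After you fix "teh" to "the" a few times, the corrector takes over that chore uniquely for you — your habits, not some global dictionary, drive the rules.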

Now: Who wants to hear about Constance, a software idea that ties all of these ideas and some others together?

Our story so far: Lately, I have been tap-dancing around an idea for a new kind of computer-user operating paradigm. I haven’t explicated the central thesis yet, but it should be easy enough to infer from the essays I’ve written so far.



    4 Comments so far

    1. Don Reedy January 3rd, 2012 3:58 pm

      For Carmen….I’m all ears. You know, they don’t dance like Carmen no more….

    2. Greg Swann January 3rd, 2012 4:53 pm

      Sorry, Carmen is becoming Constance. More about her soon.

    3. Teri Lussier January 17th, 2012 5:28 pm

      Late to the party, catching up on reading.

      So. I don’t have to plug this with info, she just intuitively learns from me? Training a real human is time and money intensive, whether I’m training an admin assist or I’m learning how to use a piece of equip. But this would learn from me? But but but… That means all my bad habits would show up as well?? eek :)

    4. [...] etc. It is definitely true that understanding how a dog’s brain functions is valuable — it’s a path to better software. But in the end, the purpose of “news” like this is not to teach you something new but [...]