
The World's best memory – IMPROVED!

IBM has been developing a memory known as “Race Track” memory (a trademark of IBM). It uses spintronics (quantum effects), magnetic fields, and the ability to move bits up and down a string of atoms as the means of storing data.

This is one of two amazing memory types in contention for the future of memory, the other being memristor memory.

The amazing thing is that my technology can actually increase the data density of Race Track Memory.

The trick is built into Pascal's Triangle: to achieve this data density increase we merely shorten one (or more) of the strings in the memory system. Density could be increased quite easily, enough to store the entire operating system and all necessary drivers in permanent format.

Change what spins

CDs spin because it is easier to spin the disc than it is to spin the reader while maintaining balance, or to set up scanners that read the whole disc at once. Indeed, at the consumer level CDs and DVDs will be hard to replace.

However, at a commercial level things can, and should, change! A CD-ROM's data storage can, in theory, be boosted three times or more when the disc is held stationary, per the methods described in my patent. DVDs can experience significant upgrades as well.

However, this technology also looks at magnetic drives in a new manner. My technology could allow magnetic drives to increase their data density by as much as 10%!

The patent-pending technology covers many different possibilities and has FOUR underlying major claims (or subsets of the invention), and just one of them allows this increase!

3D Printed Storage

It is amazing how fast 3D printing is advancing. Current technologies are peaking at 10×10×100 µm with 1,000,000 colors. This is remarkable indeed.

As we get closer to 1×1 µm printing with full color, the only real consideration will become the price of colored plastics. Plastics are so cheap that one could conceivably print out 10 terabytes for $2 (plus the cost of the printer, but the printer would let you print as much as you need).
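
As a rough check on that figure, here is a back-of-the-envelope sketch in Python. The areal density (about 1.5 GB per inch², taken from the calculation later in this post), sheet thickness, plastic density, and bulk plastic price below are my own assumptions, not figures from the patent; they simply show the $2 claim is in the right ballpark.

    # Back-of-the-envelope material cost for 3D printed plastic storage.
    # Every input here is an assumption used for illustration only.
    areal_density_gb_per_in2 = 1.5     # assumed: full-color printing at ~1 um spots
    sheet_thickness_cm = 0.01          # assumed: 0.1 mm printed sheet
    plastic_density_g_per_cm3 = 1.2    # typical printable plastic
    plastic_price_usd_per_kg = 5.0     # assumed bulk pellet price

    target_tb = 10
    area_in2 = target_tb * 1000 / areal_density_gb_per_in2    # square inches needed
    volume_cm3 = area_in2 * 6.4516 * sheet_thickness_cm       # 1 in^2 = 6.4516 cm^2
    mass_kg = volume_cm3 * plastic_density_g_per_cm3 / 1000
    print(f"{area_in2:,.0f} in^2 of prints, {mass_kg:.2f} kg of plastic, "
          f"about ${mass_kg * plastic_price_usd_per_kg:.2f} in material")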

This memory type would of course require a whole new reading system, but this will not dramatically increase costs when one is printing thousands of terabytes of data. Light-based reading is among the fastest ways to read memory, and plastics have very long shelf lives.

The first storage method

There are numerous ways my patent-pending technology can be used to store data. The first method, which is marketable almost immediately, is via silicon wafers.

With silicon wafers we can produce memory at a potential cost of $20 a terabyte (for production and storage). The memory needs to be kept in cleanroom conditions, however.

This first version, however, is immediately EMP-proof, can store data for decades, and can be custom encrypted to prevent thieves from reading it. The encryption, if done correctly, would be practically undecipherable without the key and knowledge of the encryption method.
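
The patent does not specify an encryption scheme, so purely as an illustration, here is a minimal sketch of encrypting a wafer image with a standard authenticated cipher (AES-256-GCM via Python's third-party cryptography package) before it is written out. The file names are hypothetical.

    # Illustration only: encrypt a wafer image with AES-256-GCM before writing it.
    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    key = AESGCM.generate_key(bit_length=256)   # store the key offline, separate from the wafer
    nonce = os.urandom(12)                      # must be unique for each image

    with open("wafer_image.bin", "rb") as f:    # hypothetical raw image to be written to the wafer
        plaintext = f.read()

    ciphertext = AESGCM(key).encrypt(nonce, plaintext, None)

    with open("wafer_image.enc", "wb") as f:    # nonce stored alongside the ciphertext
        f.write(nonce + ciphertext)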

This initial system will probably read slowly; it is suited for special backups where losing data is NOT an option. After working security at a US Bank Corp. check and ATM processing facility, I can tell you they kept three backups of their data. My system would be cheaper, easier to store, and longer lasting. These are important considerations!

The price per terabyte is expected to drop over time as microchip etching improves. Current plans work with a 50 nm feature size; a 25 nm feature size would quadruple the storage for the same cost. So yes, it is quite possible in 5 years or so to pay $5 a terabyte for long-term storage via silicon wafer. This is an estimate of production and storage prices, however; some profit margin will apply!
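
The quadrupling follows because the number of spots that fit in a fixed area scales with the inverse square of the feature size; a quick sketch:

    # Spot count in a fixed area scales as 1 / (feature size)^2.
    def spots_per_in2(feature_nm):
        NM2_PER_IN2 = 645_160_000_000_000   # square nanometres in one square inch
        return NM2_PER_IN2 / feature_nm**2

    print(spots_per_in2(25) / spots_per_in2(50))   # -> 4.0, i.e. quadruple the capacity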

The Physical Hard Drive

The Physical Hard Drive is a patent-pending system for holding data. Technology has advanced to the point where this is a functional data storage method, as well as a data storage augmentation method. I developed it last year and seem to be well out in front of the competition. It is based on the simple idea of using lines, dots, shapes, irregular heights, holes, and variably sized drives to increase potential capacity past a terabyte per inch². See the pending patent in PDF at http://www.MichaelHarrington.com/Patent.pdf

An inch² contains 645,160,000,000,000 nm². I propose an initial standard of 50×50 nm “spots”, which avoids the need for complicated lithography until it can be mastered; however, spots are not the only method, just a unit of measurement for now.

There exist several ways to encode data today, from magnetic and optical drives to the old punch cards. My method uses physical lines, dots, and even shapes to achieve a higher potential data density on a given amount of material, and can also include colors, textures, elevation differences, empty spaces, holes, and different sizes (and subsections) of drives to vastly increase data density.

While some of those techniques, colors for instance, may not currently be cost effective or may offer limited data density, advances in technology will make them more cost effective and denser. Light-based printing, for instance, will eventually be able to produce near full color at 1 µm. This would mean essentially 23 bits of color per µm².

Currently, full color appears to be possible only at 100 µm, though I suspect 10 µm will happen in the next two years. A 10×10×100 µm model with only 1,000,000 colors exists today, already a considerable increase over what was available in the past. At that technology level, color encoding is capable of roughly 15 MB per inch². At 1 µm this becomes roughly 1.5 GB per inch², and amazingly this can be printed on plastic or a composite in the future, meaning you could make a “deck of 54 two-sided cards”, almost the same size as a current hard drive, that would hold a terabyte for pennies on the dollar.
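
A quick check of those densities: a 1,000,000-color palette carries log2(1,000,000) ≈ 20 bits per spot, and treating each card as a standard 2.5 × 3.5 inch playing card is my own assumption. The results land in the same ballpark as the figures above.

    import math

    UM2_PER_IN2 = 25_400 ** 2            # square micrometres in one square inch

    def mb_per_in2(spot_um, colors):
        bits_per_spot = math.log2(colors)              # ~20 bits for 1,000,000 colors
        return UM2_PER_IN2 / spot_um**2 * bits_per_spot / 8 / 1e6

    print(f"10 um spots: {mb_per_in2(10, 1_000_000):.0f} MB per inch^2")
    print(f" 1 um spots: {mb_per_in2(1, 1_000_000) / 1000:.1f} GB per inch^2")

    # A 2.5 x 3.5 inch card printed on both sides at the 1 um level:
    card_gb = 2 * 2.5 * 3.5 * mb_per_in2(1, 1_000_000) / 1000
    print(f"Deck of 54 cards: {54 * card_gb / 1000:.1f} TB")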

Since an inch² has 645,160,000,000,000 nm² and I am proposing a 50×50 nm spot standard, this leaves at most 258,064,000,000 locations on one side of an inch². At one bit per location, this corresponds to roughly 30 gigabytes per inch² (on one side).
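
The arithmetic, assuming one bit per spot:

    NM2_PER_IN2 = 645_160_000_000_000
    spots = NM2_PER_IN2 // (50 * 50)     # 258,064,000,000 locations per side
    gb = spots / 8 / 1024**3             # one bit per spot
    print(f"{spots:,} spots -> {gb:.0f} GB per inch^2 per side")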

Through a new sub-school of math I call “Advanced Combinations”, it becomes possible to use 3D structures. Recent work by Dr. Onur Tokel has used infrared lasers to create structures, lines, tunnels, and dots INSIDE silicon wafers without damaging the wafer's surface. In theory, using multiple interior levels should make it possible to reach as much as triple the normal total data storage.

However, even without such a method, we can increase the yield in a variety of ways. Line alignment is one option, where each subsection has its own alignment relative to the other subsections. Additionally, if we have interruptible rays from a central source (i.e. breakable lines) and multiple central sources (with those sources located anywhere), we gain a significant means of encoding vast amounts of data.

One advancement is the idea of the hole. Ten holes in a 100-bit section cost 10 bits but add 26 bits, a net 16% increase applicable to any existing memory system. Pascal's Triangle provides the key to understanding this: the locations of the holes are what encode the data. Structural integrity should be the only factor of concern. This data is permanent in nature; if you drill holes in a magnetic drive (thus removing the magnetism at those specific locations), those locations are no longer read/writeable, but their pattern is forever encoded. A 5-10% increase in data density should be achievable with this technique. In a terabyte of storage this would represent between 50 and 100 gigabytes of data permanently encoded, which should suffice for a full operating system, all current drivers, and a number of common programs (browsers, map programs, music/movie players, etc.), as desired at time of manufacture.
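
The counting argument behind this can be sketched as follows: choosing which k of n spots to drill selects one of C(n, k) possible patterns, so the hole positions can carry up to log2 C(n, k) bits. That is an information-theoretic ceiling; a practical encoding will recover less, which is why the figures above are conservative. The C(n, k) values are exactly the entries of Pascal's Triangle.

    import math

    def hole_bits(n, k):
        """Upper bound on the bits encodable by the positions of k holes among n spots."""
        patterns = math.comb(n, k)       # an entry of Pascal's Triangle
        return math.log2(patterns)

    n, k = 100, 10
    print(f"{k} holes among {n} spots: up to {hole_bits(n, k):.1f} bits gained, "
          f"{k} bits of ordinary storage given up")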

We can also significantly increase the data simply by using 4 levels of depth per side of a given wafer. Add to this the new infrared technology, which lets us place and read data inside a wafer, and we could, in theory, triple to quadruple the data storage. With minimal depth differences it is already double.
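
The doubling follows from the information carried per spot: a spot that can sit at any of L distinguishable depths carries log2(L) bits instead of 1. Treating interior layers as a simple multiplier on top of that is my own simplification, shown here only to make the scaling concrete:

    import math

    BASELINE_GB_PER_SIDE = 30                     # one bit per 50 nm spot (from above)

    def per_side_gb(depth_levels, interior_layers=0):
        bits_per_spot = math.log2(depth_levels)   # 4 depth levels -> 2 bits per spot
        return BASELINE_GB_PER_SIDE * bits_per_spot * (1 + interior_layers)

    print(per_side_gb(4))                         # 60.0 GB per side: double the baseline
    print(per_side_gb(4, interior_layers=2))      # 180.0 GB per side if two interior layers match the surface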

At this juncture, using only proven methods, I believe 132 gigabytes per inch² is possible, and as much as 800 gigabytes in theory using methods not yet fully modeled. Advances in technology point to ever-rising amounts of data we can store. I suspect the final level will exceed 2 TB per inch².
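
One way the 132 GB figure can be assembled from the pieces above (this decomposition is my own reading of the post, not a statement from the patent): 30 GB per side, two sides, doubled by four depth levels, plus the roughly 10% hole bonus.

    baseline_per_side_gb = 30    # 50 nm spots, one bit each
    sides = 2
    depth_factor = 2             # four depth levels -> 2 bits per spot
    hole_bonus = 1.10            # ~10% permanently encoded extra (see above)

    print(baseline_per_side_gb * sides * depth_factor * hole_bonus)   # -> 132.0 GB per inch^2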

This technology will initially be useful (due to cleanroom requirements and read/write equipment costs) only for large companies with long-term data backup requirements. This includes the Government (NSA, especially, it would seem), colleges/universities, banks and financial institutions, and very large corporations.

The advantages are huge data storage density, potentially low cost (silicon is far more common than the rare earth elements used in magnetic drives), and a safe format for storing data. How does this compare with existing hardware? At $7.60 for the roughly 7 square inches of a 3-inch-diameter wafer, we have a cost from $9 a terabyte at the highest down to as low as $1 a terabyte (source: University Wafer)… excluding the read/write hardware, cleanroom, wafer storage, and other incidental costs.
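
A sketch of how the per-terabyte figures fall out of the wafer price and the density estimates in this post (the $7.60 wafer price is from the source above; the density values are this post's proven, theoretical, and suspected figures). The outputs land in roughly the same $9-to-$1 range.

    import math

    wafer_price_usd = 7.60
    wafer_area_in2 = math.pi * (3 / 2) ** 2       # about 7.1 in^2 for a 3-inch wafer

    for gb_per_in2 in (132, 800, 2000):           # proven / theoretical / suspected ceiling
        tb_per_wafer = wafer_area_in2 * gb_per_in2 / 1000
        print(f"{gb_per_in2:>4} GB per inch^2 -> {tb_per_wafer:.2f} TB per wafer, "
              f"${wafer_price_usd / tb_per_wafer:.2f} per TB (wafer cost only)")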

For potential investors or buyers:
This is an applicable technology. The error rate will be low using the 50 nm standard. At nearly 1 TB per 3-inch-diameter wafer, a lot of data can be stored in carry boxes, and the cost cannot be beat. Yes, there are other costs involved, but the combined costs should still keep the total under $20 a terabyte. These wafers will also be EMP-proof, will have exceedingly long shelf lives, and require no electricity when properly stored, and advances in math and manufacturing will continue to make this the most cost-effective data storage system in the world.

As an added bonus, my technology allows current data storage methods to increase their density by as much as 10%.

Much of the potential is tied up in the math models. I will also accept a 3-year consult-and-assist contract to help bring this technology to full capability. A team of a couple of software developers and a mathematician should be able to develop the models without undue difficulty.