Everything you believe about Information Technology is probably wrong. Here's why: Information Technology (IT) has undergone tumultuous changes ever since the January 1975 issue of Popular Electronics displayed the Altair 8800 on its front cover. Hobbyists in particular were ecstatic at the prospect of actually owning a programmable computer, albeit one with very serious limitations. Upon seeing the magazine, Paul Allen and Bill Gates, who later founded Microsoft, sprang into action, quickly writing a version of BASIC for the machine. Their efforts helped launch the personal computer (PC) revolution. The next year Steve Jobs and Steve Wozniak introduced the Apple I, followed by the Apple II in April 1977.

Despite the enthusiasm among the homebrew computer community, mainstream businesses did not take these “toy” computers seriously. The general population also wondered why anyone would ever need a PC. It didn't help that there were few standards, and that applications were limited largely to games, checkbook balancing, recipes and the like. It was not until IBM introduced the IBM PC on August 12, 1981, with Microsoft supplying the disk operating system (MS-DOS), that PCs gained credibility. Even so, established businesses with significant capital invested in their legacy systems were skeptical and reluctant to change.

Cheaper, IBM PC-compatible machines soon appeared, along with more languages and applications, such as electronic spreadsheets, word processors and database managers. At last, tech-savvy power users could program their own machines without depending upon services provided by established IT companies, or being subject to the priorities of their employer's data processing department. Users could manipulate critical information sooner and no longer had to wait for others to provide it to them.
The Apple Macintosh introduced new features like the graphical user interface (GUI) and the mouse, both ideas “borrowed” from the Xerox Palo Alto Research Center (PARC). Users could now interact with their machines in a more intuitive way, rather than working with a character-based system. It's ironic that Xerox could have been a pioneer in the PC marketplace had its senior executives abandoned their archaic, paper-based copier mentality in favor of newer, digital-age thinking. Evidently, what they believed about the future of information technology was completely wrong.

Meanwhile, Microsoft, like Apple, introduced a GUI operating system for the IBM PC. They called it Windows. This newer operating system drove demand for higher-resolution color monitors and a what-you-see-is-what-you-get (WYSIWYG) expectation on the part of users. Laser printers could render on paper what was seen on the monitor. Computer Aided Design (CAD) programs and more realistic games soon followed.

At the time, dial-up bulletin boards allowed users with modems to share data and access online services. Bulletin boards were supplanted by the Internet, which allowed anyone to browse web pages anywhere in the world. Though exciting, what opportunities were there to make money on the new World Wide Web? Again, some questioned this newest innovation. To use the Internet for purchases, you needed secure monetary transactions. Credit cards might work, but browsers and web servers also needed to be secure. As web security improved, online payments could be made via the new and popular third-party service PayPal, which didn't require disclosing your credit card number with each transaction. Still, with a rapidly expanding Internet, it was very difficult to find your intended search results. Various companies had their own search engines, but the user still had to manually sift through the returned links to find the sought-after information.
Enter Larry Page and Sergey Brin with the Google search engine. Theirs was technically superior because it delivered relevant content and responded instantly. They pitched their search engine to Yahoo, but were dismayed when their offer was rebuffed. Yahoo said there was no money to be made in search; they were not interested. Undaunted, Larry and Sergey demonstrated Google to venture capitalist Andy Bechtolsheim and accepted his investment check of $100,000. Google eventually proved that money could be made from search if appropriate advertisements were placed alongside returned content. Today Google is one of the most valuable companies on the planet. In retrospect, Yahoo's rejection of Google's search engine seems a blunder of colossal proportions, particularly for a company that was an early Internet leader.

It's not that the truth isn't out there; it's simply that it often goes unrecognized. As human beings, we seem to be lulled into a normalcy bias, becoming risk-averse toward ideas outside our experience or comfort zone. It's only when reality finally reasserts itself that we realize we were wrong. Of course, by then it's a bit too late.

Consider some current technologies that are disappearing due to innovation. The landline telephone is all but gone in most homes in favor of mobile cell phones. Brick-and-mortar department stores and shopping malls are vanishing, unable to win against online retailers like Amazon. Big-box stores like Home Depot are driving out local hardware stores. Mom-and-pop delis and smaller grocery stores simply cannot compete with the likes of Walmart Supercenters and their dominating pricing strategy. It's clear that disruptive innovations change everything, and information technology is at the obvious forefront. It's been said that in the next decade we will need millions of people coding software.
That statement seems as logical as the 1940s prediction that, within ten years (due to the rapid growth of the telephone system), half the female population of the United States would be employed as switchboard operators. Plainly, that did not happen. In the succeeding years, advanced electronics changed everything within the telephone industry. A disruptive innovation, Project Notai will revolutionize the way future software is developed. Why? Despite the impressive advances in Artificial Intelligence (AI), information technology is still stuck in a kind of gravity well. Amazing as AI may appear, it's doubtful that a neural network or other advanced analytics could ever write a useful line of computer code. By contrast, Project Notai is able to solve problems by writing programs, achieving the kind of escape velocity that allows it to transport users to worlds unknown.
Consider the ways in which a graph database like Neo4j, with its Cypher Query Language, is superior to a traditional relational database and its Structured Query Language (SQL): relationships are stored directly as edges between nodes, so queries that would require long chains of joins in SQL become simple traversals in Cypher.
Project Notai will use the Neo4j graph database and the Cypher Query Language in its expert system rather than a traditional relational database and SQL. Neo4j is fully ACID-compliant, guaranteeing Atomicity, Consistency, Isolation and Durability.
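To see why traversals beat join chains, here is a minimal Python sketch of the idea. It is not Neo4j itself, and the names and data are hypothetical; it simply shows that when relationships are stored as direct edges, a multi-hop query is a series of lookups rather than repeated table joins.

```python
# Toy adjacency-list "graph database": each relationship is a direct edge,
# so following it is a dictionary lookup, not a join.
knows = {
    "Ann": ["Bob", "Cara"],
    "Bob": ["Dave"],
    "Cara": ["Dave", "Eve"],
    "Dave": [],
    "Eve": [],
}

def friends_within(start, depth):
    """Everyone reachable from `start` in at most `depth` hops.

    In Cypher this is roughly:  MATCH (p {name: $start})-[:KNOWS*1..n]->(q) RETURN q
    In SQL the same query needs one self-join per hop (or a recursive CTE).
    """
    seen, frontier = set(), {start}
    for _ in range(depth):
        frontier = {f for person in frontier for f in knows.get(person, [])}
        seen |= frontier
    return seen

print(sorted(friends_within("Ann", 2)))
```

Each additional hop in Cypher is just another edge traversal; in SQL, each hop adds another join over the entire relationship table, which is where the performance gap comes from.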
We believe it is incumbent upon all of us to repair our world, to leave it a better place than we found it, and to ensure a brighter future for our children to inherit. We maintain that it is better to create than to destroy―and finer still to restore than to abandon.
Development of Project Notai began in Suite 228 at SUNY Fredonia's Technology Incubator in Dunkirk, New York, on June 1, 2018, and is continuing. Feel free to drop in at any time.
Cython is a superset of Python that delivers performance approaching that of C. Because Cython modules are compiled, applications can be distributed without revealing their underlying source code, protecting intellectual property. Cython is derived from Pyrex.
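As a rough illustration, here is the kind of numeric hot loop that benefits from Cython. The function below is plain Python (the name and example are our own, not from Project Notai); the comment shows how the same code would look with Cython's static type declarations, which is what lets the compiled version run at near-C speed.

```python
# Plain-Python version of a numeric hot loop. In a .pyx file, adding static
# types lets Cython compile the loop to pure C, e.g.:
#
#     def integrate(double a, double b, int n):
#         cdef int i
#         cdef double x, dx = (b - a) / n
#         cdef double total = 0.0
#         ...
#
def integrate(a, b, n):
    """Approximate the integral of x*x over [a, b] with n rectangles."""
    dx = (b - a) / n
    total = 0.0
    for i in range(n):
        x = a + i * dx
        total += x * x * dx
    return total

print(integrate(0.0, 1.0, 100000))  # approaches 1/3
```

Loops like this are exactly where interpreted Python pays its overhead per iteration, and where the typed, compiled Cython version typically shows the largest speedups.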
Notai will use the InterPlanetary File System (IPFS), a peer-to-peer hypermedia protocol designed to make the web faster, safer and more open. Think distributed. Think cryptographic. Just think. On April 29, 2017, the Turkish government blocked all access to the online encyclopedia Wikipedia because it referred to Turkey as a state sponsor of terrorism. In a dramatic move, activists copied the Turkish version of Wikipedia onto the IPFS platform. This decentralized format makes censorship extraordinarily difficult because the data is no longer located on a single server identified by an IP address. Instead, content is scattered across many peer computers and addressed by cryptographic hashes rather than by location. The IPFS format also makes Distributed Denial of Service (DDoS) attacks far less effective. Freedom of the press, it seemed, had won a small victory through the use of advanced technology.
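The key mechanism is content addressing: a file's name is derived from its bytes, not from the server that stores it. Here is a toy Python sketch of that idea (it does not produce real IPFS CIDs, and the `put`/`get` helpers are our own illustration).

```python
import hashlib

# Toy illustration of IPFS-style content addressing: a block of data is
# named by the hash of its bytes, so its identifier depends only on the
# content -- never on which server or peer happens to hold it.
store = {}

def put(data: bytes) -> str:
    cid = hashlib.sha256(data).hexdigest()  # content identifier
    store[cid] = data
    return cid

def get(cid: str) -> bytes:
    data = store[cid]
    # Any peer can verify the data matches its name, so tampering is evident.
    assert hashlib.sha256(data).hexdigest() == cid
    return data

cid = put(b"Freedom of the press")
print(cid[:12], get(cid))
```

Because the identifier is a hash of the content, any peer can serve the data and any recipient can verify it, which is why blocking one server (or flooding one IP address) no longer removes the content from the network.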
Disruptive innovations change everything and Information Technology is clearly at the forefront. My name is Timothy O'Malley, owner of MarsBux Technologies and the creator of Project Notai, a new kind of Information Technology. Because Notai actually writes code to find optimal solutions, it will forever change the way software is developed. The increased productivity and cost savings provided by Notai offer very real competitive advantages. Is this AI? No―it's Notai.
Although the initial human footprint on the planet Mars would wait until the 2030s, our story begins a few decades earlier. After the turn of the 21st Century, there was a young dreamer named Elon Musk. Emigrating from his native South Africa, this ambitious lad made his fortune early in life by writing computer software and selling the companies he founded for massive amounts of money. Yet, despite his precocious success and fame, his passion―his burning desire, really―was always the exploration of space, and in particular, the mysterious and foreboding Red Planet.
He dreamed of flying through space to that distant and austere world, witnessing its pockmarked surface and the gargantuan volcano Olympus Mons, named for the legendary abode of the fabled Greek gods. Yes, he visualized traveling there in a rocket ship like some hero from an Isaac Asimov novel he treasured. He even said once that he wanted to die there―but just not upon impact. But how could this, his boyhood fantasy, be realized during his adult life?
Then it struck him like the proverbial bolt from the blue. He knew what he would do―what he must do: he would start his own rocket company. He would use his windfall of capital to fund the project himself. It would be he, not some bureaucratic government, who would accomplish this extraordinary feat. He would call his company SpaceX, and come what may (and many setbacks there would be), he would never give up, he would never quit. He would persevere until that magnificent new day dawned when the human race became a multi-planetary species.
Naysayers and educators alike said it could not be done. It was reckless; it was impossible; it was but a pipe dream. But as we now sit comfortably in sprawling Elon City on that rusty fourth planet, we must never forget those who made all of this possible, visionary pioneers like Elon Musk, who against all odds, possessed a dream and would never let it go unrequited.
In 2017 we created the cryptocurrencies MarsBux (MARS) and MarsBux2 (MARS2) as an experiment in blockchain technology. Since no Initial Coin Offering (ICO) occurred, they may be considered utility tokens rather than investment-grade coins. MARS and MARS2 cannot be bought or sold directly, but are traded exclusively on CoinExchange.io. Scrypt-style ASIC devices can also mine these two cryptocurrencies from a suitable pool. Neither MARS nor MARS2 should be confused with MarsCoin (also called MARS), or other similarly named currencies―they are simply incompatible. Attempting, for example, to transfer a quantity of MarsBux into a MarsCoin wallet, or vice versa, will result in irretrievably lost coins. Whatever value MARS or MARS2 may have is determined entirely by the marketplace and is subject to existing laws. We will not attempt to offer any financial advice regarding these or any other digital assets. We cannot predict their future value.
Neural networks can be trained to recognize written characters. After scanning 100,000 examples of written digits, for example, a neural network was able to associate the correct digit with about 98% accuracy. Likewise, similar systems have been developed to identify letters from their written forms.

But what happens when a matrix of random dots, like the noise of an unconnected TV, or even a random photograph, is presented to the system? Surprisingly, the program will still resolve the image to some digit or letter, as if the information were somehow encoded within it. Moreover, any number of random patterns can map to the same character.

A new method for cryptography might be as follows:

► Generate many random patterns that correspond to specific characters in a neural network.
► Send a sequence of random patterns as the characters of a message, but never repeat a pattern.
► Receive the patterns and use the original neural network to decrypt the message.

Since the entire message appears to be random, and since no image is ever repeated, no discernible encryption method should be detected. If the neural network were sufficiently complex, it's unlikely that the message could be decoded. Code breakers would then have to focus on recreating the exact neural network. Good luck with that!
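The three steps above can be sketched in a few lines of Python. This is a deliberately simplified toy: in place of a shared trained neural network, both parties share a keyed hash that "classifies" any random byte pattern to one character, and all the names (`classify`, `encrypt`, `decrypt`, the shared key) are our own illustration, not part of any real cipher.

```python
import hashlib
import os

# Toy stand-in for the scheme described above. A real design would share a
# trained neural network; here sender and receiver share a keyed hash that
# maps any byte pattern to exactly one character of the alphabet.
ALPHABET = "abcdefghijklmnopqrstuvwxyz "
KEY = b"shared-secret-network"  # hypothetical shared secret

def classify(pattern: bytes) -> str:
    """The shared 'network': deterministically map a pattern to a character."""
    digest = hashlib.sha256(KEY + pattern).digest()
    return ALPHABET[digest[0] % len(ALPHABET)]

def encrypt(message: str) -> list:
    """For each character, search random patterns until one classifies to it."""
    patterns, used = [], set()
    for ch in message:
        while True:
            p = os.urandom(16)                      # fresh random pattern
            if classify(p) == ch and p not in used:  # never repeat a pattern
                used.add(p)
                patterns.append(p)
                break
    return patterns

def decrypt(patterns: list) -> str:
    """Run every received pattern through the shared classifier."""
    return "".join(classify(p) for p in patterns)

print(decrypt(encrypt("hello world")))
```

Note the property the scheme relies on: because every transmitted pattern is freshly random and never reused, the ciphertext carries no visible statistical structure; only a party holding the exact shared classifier can map the patterns back to characters.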