BINDER: Many companies have already thrown in their support for the proposed Broadband Interconnection for National Development and Emergency Response (BINDER). This might sound like the National Broadband Network (NBN), but it is not. The NBN is technically just a Value Added Network (VAN), without any Value Added Services (VAS) of its own. As I proposed it, BINDER will not only be a VAN; it will be the backbone to which any number of VAS could be attached, so that these could function and operate.
VAN-VAS ARCHITECTURE: As it was supposed to be (but it did not happen), the government would build the VANs, and that would have enabled the private sector to attach their own VAS without spending too much on their own infrastructure. This would have lowered the cost of services, because the private sector would not have had too much investment to recover. This is similar to what happened with the network of roads and highways built by the government: with these in place, the private sector was able to provide the land transportation services that now run on these networks.
PRIVATE TOLL WAYS: We pay our taxes, and in doing so, we expect the government to build infrastructure like roads and highways. As it happened however, the government was not able to build the superhighways as it was supposed to do, and that is why private companies had to build and operate the private toll ways, and that is why we have to pay as we pass through. In a way, this is like double taxation, but that is what is happening, because what was supposed to happen did not happen.
INFORMATION SUPERHIGHWAYS: Since the government was not able to build the national broadband networks as it was supposed to do, the bigger companies decided to build their own information superhighways. As it was supposed to be, these private information superhighways were supposed to operate only as VANs, because the original intention was to give the smaller companies the opportunity to provide the VAS components. As it happened, the bigger companies monopolized the business by running their own VAS components in their own VANs, thus leaving out the smaller companies.
INTERNET SERVICE PROVIDERS (ISPs): As it originally happened, the smaller companies were buying bandwidth from the VANs of the bigger companies, so that they could provide services to the general public as the ISPs that we knew before. As it eventually happened however, the bigger companies started providing internet services on their own, thus breaking the ethical rule that producers of goods and services should not compete with their own wholesalers and retailers. As it happened, the ISPs went out of business, leading to the monopoly of the VAN-VAS business by the bigger companies.
PATCHWORK CAN WORK: Since the bigger companies have already built their own VANs (they all have their own separate networks), it is now possible to build a nationwide (as in national, if you get the drift) broadband-based interconnection by simply patching them together (as in interconnecting them), to the extent that they would allow it. For many years now, I have always found myself in the company of government planners who would want to build a “brand new” national broadband network instead of a patchwork. They would be in the right place if they were working for a superpower, but there is really nothing wrong if a developing country would choose instead to patch together what is already there.
FOUR MAJOR TRENDS IN COMPUTERIZATION: The four major trends in computerization are (1) cloud computing, (2) big data, (3) server virtualization and (4) storage scalability. As it used to be, private companies and government agencies had to invest a lot of money in building their own data centers, thus incurring too much cost in servers, storage and facilities management. As it is now however, they have the option of acquiring these services from “the cloud” (meaning from remote offsite locations). As it used to be, it was very difficult to manage and mine huge data assets in an economical and efficient manner. As it is now however, there are newer technologies to manage and mine “big data” robustly, no matter how big and diverse it is. As it used to be, hundreds of servers were needed to run big data centers. As it is now however, servers can be “virtualized”, thus needing fewer servers and smaller spaces. As it used to be, servers and storage devices were “married” as one. As it is now however, they are “divorced”, and storage space is now scalable.
THE BETTER SIDE OF BUSINESS: I worked for San Miguel Corporation (SMC) when its corporate motto was still “Profit with Honor”. The motto is long gone, but SMC, along with many other companies, is now practising “Corporate Social Responsibility” (CSR for short). As it used to be, corporate philanthropy was a one-way street, meaning that corporations gave money to worthy causes without expecting anything back, except perhaps a good image and goodwill. As it is now however, corporations have the option of getting something back from their donations, in the form of tax deductions that reduce their net taxable incomes. It does not really matter whether corporations give donations in exchange for something or for nothing, as long as they give to worthy causes.
SYSTEMS INTEGRATOR AND PROJECT MANAGER: Many companies in the Information and Communications Technology (ICT) sector have their own CSR programs in pursuit of their own corporate objectives. Since I know many of the officers from these companies, I volunteered to become the “Systems Integrator” (SI) and “Project Manager” (PM) of a shared network that would be built from a patchwork of donated hardware, software and services, combining whatever surpluses they could donate. This is now the BINDER project.
For feedback, email firstname.lastname@example.org or text +639083159262
By Allysa Faye Greganda
It’s time for tech companies to fund a team that constantly checks the strength of everything on the internet. These days, we depend on the internet for half of our lives. We enjoy its benefits, but surely no one wants to be victimized right at his own fingertips.
Out of the 7 billion people alive at this moment, 2 billion actively use the internet. You are one of those 2 billion people risking personal information, transaction records and confidential details against an average of 30,000 hacking attempts every day. The figures do speak of something. But the question is: how sure are you that you are not part of those 30,000 incidents a day?
We are not new to the tactics of hacking: the stealing of data that can possibly be used to access bank accounts and whatnot. There is a long history of serious hacking incidents that have caused companies to shut down and lose millions of customers. Take Mt. Gox as the perfect example. Once the world’s largest bitcoin exchange, it lost around $450 million worth of bitcoins to hackers some months ago.
Hacking a website or a person’s account usually requires direct contact, with the victim unknowingly giving permission to inject a virus that gets through the encryption. Just weeks ago, the internet community was stunned when three engineers discovered the Heartbleed bug, which had been in existence for two years already. No programmer ever thought that a bug, a simple programming error, could become the biggest flaw in the history of the internet.
World Weak Web
Last April 3, 2014, engineers from Codenomicon and Google found out about this serious hole in the internet. For the non-techie’s information, the internet is a virtual world. There are a lot of places to visit there, and these are called websites. Each website uses the TLS (Transport Layer Security) protocol to encrypt information in transit, and to keep it protected. This method keeps our personal details from being stolen by hackers. But why were websites vulnerable to attack if TLS is supposed to secure information?
The hole in the websites came from a piece of software called OpenSSL, which is an implementation of TLS. In other words, OpenSSL is a software library that website hosts use to encrypt their traffic. It is created by a group of volunteer programmers and offered online for free. It is by far the most widely used TLS implementation, which is why 66% of websites, according to Netcraft’s April 2014 Web Server Survey, run on servers that use OpenSSL. Those among them running the recent versions (OpenSSL 1.0.1 up to 1.0.1f) had been carrying the vulnerable code for a span of about two years, since it was introduced at the end of 2011.
Because it is so widely used, even less-visited websites rely on it, including government websites in the Philippines. Yet no one knows whether someone noticed the bug earlier. Someone could have, but never reported it, perhaps taking advantage of the weakness instead. There are many rumors about possible abuses made before the bug was discovered, but none have been proven.
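For readers who want to see what the “simple programming error” actually looked like, here is a much-simplified sketch in Python. The real bug lived in OpenSSL’s C code for the TLS heartbeat extension; this toy version, with made-up data (`MEMORY`, the `hunter2` secret) chosen purely for illustration, only shows the idea of the missing bounds check: the server trusted the length claimed by the client and echoed back that many bytes of its own memory.

```python
# Toy simulation of the Heartbleed flaw (CVE-2014-0160).
# The real bug is in OpenSSL's C heartbeat handler; this sketch only
# illustrates the missing bounds check, with invented data.

# Pretend this is the server's memory: the heartbeat payload sits
# right next to unrelated secrets.
MEMORY = b"ping" + b"secret-password:hunter2"

def heartbeat_vulnerable(payload: bytes, claimed_length: int) -> bytes:
    # BUG: the server trusts the length the client claims and echoes
    # back that many bytes, reading past the end of the real payload.
    return MEMORY[:claimed_length]

def heartbeat_patched(payload: bytes, claimed_length: int) -> bytes:
    # FIX: silently drop heartbeats whose claimed length exceeds the
    # actual payload, as the patched OpenSSL does.
    if claimed_length > len(payload):
        return b""
    return MEMORY[:claimed_length]

# An honest client asks for its 4-byte "ping" back.
print(heartbeat_vulnerable(b"ping", 4))   # b'ping'
# An attacker sends "ping" but claims it is 27 bytes long.
print(heartbeat_vulnerable(b"ping", 27))  # b'ping' plus the adjacent secret
print(heartbeat_patched(b"ping", 27))     # b'' -- malformed request rejected
```

The point of the sketch is how small the mistake is: one missing comparison between a claimed length and a real one was enough to leak passwords and keys from server memory.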
Debugging The Problem
The good news is that about 33% of websites were safe from the vulnerability throughout those two years, including most-accessed social networking sites like Facebook, Twitter, Pinterest, Google and Yahoo. The other good thing is that the OpenSSL team released a fix, a patched version (1.0.1g), promptly after the report, to cure the problem. It was announced that the internet community is safer, but not totally safe until everyone has updated their TLS software. The least a normal user can do is change passwords as frequently as possible. Users should also countercheck whether the websites they are trying to access are safe enough to engage with, and keep updated on whether any new mode of attack is spreading.
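Keeping updated is something even an ordinary user can partly script. As a small sketch, and assuming a standard Python installation, the `ssl` module reports which OpenSSL build it is linked against, which is one quick way to check whether your own tools still carry a Heartbleed-era version:

```python
import ssl

# Python's ssl module is linked against a particular OpenSSL build;
# printing it shows which version your own tooling depends on.
print(ssl.OPENSSL_VERSION)  # e.g. "OpenSSL 1.0.1g 7 Apr 2014"

# Versions 1.0.1 through 1.0.1f carried the Heartbleed bug; 1.0.1g
# was the fixed release. OPENSSL_VERSION_INFO is a 5-tuple of the
# form (major, minor, fix, patch, status), with patch letter 'g' == 7.
info = ssl.OPENSSL_VERSION_INFO
in_vulnerable_line = info[:3] == (1, 0, 1) and info[3] < 7
print("Heartbleed-era OpenSSL build:", in_vulnerable_line)
```

Any modern build should print a version well past 1.0.1g; if it does not, updating the software is exactly the kind of habit this incident should teach us.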
This incident should be taken seriously. Hacking is illegal and is considered a serious crime. We should not be relaxed about it, because a number of victims were caught off guard. Hackers break into systems for personal, and often criminal, reasons. Everyone should get involved. Google and Facebook are fortunate enough to hire skilled programmers to defend their systems from intruders. But how about the others?
How about the safety of the entire community? Since the internet is a virtual realm, then just like in the real world, there must be a team to ensure the security of the netizens. There is always a tendency for human error, but as they say, two or more heads are better than one.
(The author is a graduating student of AB Communication in the University of Perpetual Help System Laguna. She is currently working as an intern for OpinYon.)