
Reaction Paper: “Regulations and Cyberspace”

(Reaction from the A/V presentation at MMS 201, Arellano University School of Law, Taft Ave. cor. Menlo St. Pasay City, Philippines, played on 03 September 2015)

In the video was Prof. Lawrence Lessig of the Stanford Law School, discussing regulation and how it applies in cyberspace.

It is quite amazing how simple the internet is, and at the same time how complicated. As John Gilmore put it: “the net interprets censorship as damage and routes around it”. This gave rise to the idea that the internet is “unregulable”: the internet hosts cyberspace, a futuristic realm which cannot be controlled by any government or sovereignty. But this also brings us to an irony: if cyberspace cannot be regulated, why do so many people worry about its regulation?

Prof. Lessig illustrated how things may be regulated by the following modalities:

  • Law
  • Norms of Society
  • Market
  • Architecture

Concept of Regulation

The discussion made by Prof. Lessig is simple and easy to understand. In the motorist’s language, a law may regulate motor vehicle speeds, such as a speed limit of 80 kilometers per hour. Meanwhile, norms are those rules not dictated by the state but which might prompt us to “pull over” whenever other vehicles are doing the same or when there is a problem. On the other hand, the market regulates the use of motor vehicles by dictating the selling price of vehicles as well as the price of gasoline and diesel. Architecture, as a modality, simply sets limits on the performance parameters of motor vehicles, such as a maximum speed of 180 kilometers per hour, the dimensions of the vehicle, and other conditions. Of the four, it is the law that most greatly affects the other three in the aspect of regulation.

Laws are sets of rules of conduct that may be used to bend and shape norms, control market prices by imposing taxes and regulatory fees, and dictate how the architecture of things should be. These rules are set ex ante, meaning “before the fact,” and violations of them are punished ex post, or “after the fact,” of the commission or omission.

Applying these to a “no smoking” regulation, for example: the law imposes the rule that only those 18 years old and above may purchase and smoke cigarettes. The law then indirectly affects the norms through campaigns declaring that “smoking kills”. People smoke depending on the market price, which is affected by the way the government imposes duties and taxes on such products. As to architecture, a law may provide for certain nicotine content limits. That way, smokers are less likely to become addicted and may eventually reject smoking.

Law dictating to architecture is the most effective way of regulating things

Law’s interactions with the other two modalities (norms and the market) are a lot easier to understand. Prof. Lessig’s approach of focusing on architecture is laudable. It is through this topic that the author came to better understand the internet and cyberspace.

In controlling architecture, one may impose building codes and zoning ordinances that provide for accessible buildings, wider roads and sidewalks, and other similar features. One can think of an “accessibility law”, for instance our very own Batas Pambansa (B.P.) Blg. 344, as amended. Said law provides for mandatory design and architecture of accessibility for Persons with Disabilities (PWDs). That means the design of roads, sidewalks, all the way to parking areas, to establishments and within them, including amenities such as comfort rooms, must not restrict PWDs’ freedom of access. This is an example of a law telling architecture how it should be.

In a nutshell, architecture is a regulator on the ability of persons or things to participate in simultaneous interactions. If the law can dictate how architecture would be, the former can also practically and effectively regulate anything within the latter’s bounds.

Understanding how cyberspace (as the internet’s architecture) is regulated

If cyberspace represents the architecture of the internet, it seems that cyberspace, and ultimately the internet, may be regulated after all. But to regulate cyberspace, one must first understand the architecture on which it was built.

Cyberspace works on the Transmission Control Protocol/Internet Protocol (TCP/IP), which contains a set of rules on how to transfer packets of data. As previously discussed in the author’s second reaction paper, titled “Implications of Structuring Communication Technology”, the process can be likened to a “water cycle”. Water on earth evaporates as vapor, rises to the clouds, condenses into rain, and precipitates back to earth. In the same manner, an email at one endpoint is first broken down into packets, then transferred into the “cloud”, then re-assembled back into a readable email at the other endpoint.
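The break-down-and-reassemble process can be sketched in a few lines of Python. This is a rough simplification, not a faithful TCP implementation: real TCP adds headers, checksums, acknowledgments, and retransmission, but the core idea of numbered packets surviving out-of-order delivery is the same.

```python
import random

PACKET_SIZE = 8  # bytes per packet; kept tiny for illustration

def to_packets(message):
    """Break a message into numbered packets, roughly as TCP numbers segments."""
    return [(i, message[i:i + PACKET_SIZE])
            for i in range(0, len(message), PACKET_SIZE)]

def reassemble(packets):
    """Sort packets by sequence number and join them back into the message."""
    return b"".join(chunk for _, chunk in sorted(packets))

email = b"Dear colleague, please see the attached reaction paper."
packets = to_packets(email)
random.shuffle(packets)  # packets may arrive out of order after crossing the "cloud"
assert reassemble(packets) == email
```

Because each packet carries its own sequence number, the receiving endpoint can restore the original message no matter what order the pieces arrive in.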

Relative Anonymity vs. Relative Identity

With such an architecture, it is difficult to answer questions like: who sends or receives emails? What are their contents? Where did they come from and where are they headed? The strength of the system is also its weakness. The manner in which people treat data on the internet affects the norms and the market. In the matter of law enforcement, we can hardly locate an internet user because TCP/IP is based only on a logical location, not a geographical location. To address this, at least three (3) different technologies were introduced: cookies, packet sniffing, and IP mapping.

To the author’s mind, these innovations, intended to reveal certain details about endpoints in internet transactions, also pose the very threat that people are worrying about. We understand that cookies are bits of information left in our PCs (and even in some mobile devices) to help a server (internet site) recall the transactions and, to a certain degree, identify the person making such transactions. But this feature may also be the means of collecting information from us to be delivered back to an identity thief. Packet sniffing, on the other hand, exposes one’s online transactions to whoever obtains the packets of data. This could be the government, but it could also be anyone else just sneaking around. IP mapping is a good idea because it provides at least an instant approximation of a person’s geographical location. However, if we speak of civil liberties, of the right to be left alone, this may not be totally acceptable to some freedom fundamentalists.
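The cookie mechanism described above can be illustrated with Python’s standard library. This is a minimal sketch, not any site’s actual code; the name `session_id` and its value are hypothetical:

```python
from http.cookies import SimpleCookie

# Server side: issue a cookie so the site can recall this visitor later.
cookie = SimpleCookie()
cookie["session_id"] = "abc123"   # hypothetical identifier for the visitor
cookie["session_id"]["path"] = "/"
set_cookie_header = cookie.output()  # the Set-Cookie header sent to the browser

# Client side: on the next visit the browser returns the value,
# letting the server link the new request to the earlier transaction.
returned = SimpleCookie()
returned.load("session_id=abc123")
print(set_cookie_header)
print(returned["session_id"].value)
```

The server never learns who the user "is"; it only learns that the same browser came back, which is exactly the degree of identification (and the privacy risk) discussed above.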

With these improvements, relative anonymity now becomes relative identity, as we can better understand whom we are dealing with, what the contents and consequences of our transactions are, and the proximate location where these transactions are taking place. The idea of a “stateless architecture” is slowly being set aside. These changes brought about more commercial activity and, at the same time, more control over cyberspace. Because of this, the four modalities must also be understood as interrelated and interdependent, such that the increase in one or more results in the decrease of dominion over the rest, and vice versa.

The interplay of modalities: the law, the norms, the market, and the architecture

The first users of cyberspace focused on interactions, i.e. communications and mail. The norm was that there should be no advertisements. But this changed with the advent of Internet Service Provider (ISP) giants like America Online (AOL). Commercial activity became prevalent. The norms slowly began accepting advertisements as a necessary part of cyberspace, which promises a cost-effective means of advertisement and publication.

The increase in market activity was acceptable for some time, until the dawn of spam advertisements and spam mail. The architecture had to be adjusted and expanded either to accommodate the increase in the market modality or simply to suppress it. The law and the norms remained silent for some time during this battle between the architecture and the market. But later on, laws were passed to control the over-activity of architecture and market and to give way to better norms in the use of cyberspace. This makes law an important controlling factor in the interplay of the four modalities.

The architecture, being the next most powerful modality, should be the one the law is most concerned about. The author agrees, with certain reservations, with Prof. Lessig that the “code”, as used in the architecture modality, has the following characteristics: (1) code is law; (2) code is plastic; (3) no law can beget bad code; and (4) good law can prevent bad code. The code (sometimes called “source code”) sets the rules on how to use cyberspace. It is the “law” of the internet. Since code is plastic, it is not inflexible: one can alter the code to enable one function and restrict others. By introducing such alterations, cyberspace may allow a more plastic interplay among the other modalities. The law as a modality is never a source of bad code, while a good law can effectively prevent or even eliminate bad code. We mentioned good law, but how about a “bad law”? The author recalls a famous quote from Justice Oliver Wendell Holmes, Jr.:

“Great cases like hard cases make bad law. For great cases are called great, not by reason of their importance in shaping the law of the future, but because of some accident of immediate overwhelming interest which appeals to the feelings and distorts the judgment.” (Northern Securities Co. v. United States, 193 U.S. 197, 400-401)

This means that bad laws may be legislated from time to time whenever a great or hard case appears, distorting some of our well-settled principles of stare decisis.


The law plays a very important part, if not the most important, in almost all aspects of life. It goes without saying that cyberspace is necessarily included. The fate of information technology can always be dictated by law. It may be true that cyberspace is out of the reach of states and sovereigns, but it cannot be beyond the reach of law. And because we have seen how law can affect the other modalities of regulation, and be affected by them in turn, it is imperative that our lawmakers legislate properly to maintain the balance of interests. If we can avoid the pronouncement of Justice Oliver Wendell Holmes, Jr. above, let us all help in making sure only good laws are passed.

Finally, we cannot all be legislators. But remember the interplay. The norms may not be the law, but if we all join hands in advocating norms most favorable to society, they can reach the eyes and ears of a legislator, who may then sponsor a law for the common good. Salus populi est suprema lex. The welfare of the people is the supreme law.


Posted on September 4, 2015 in Other Law Subjects, Technology and the Law



Reaction Paper: “Implications of Structuring Communication Technology”

(Reaction from the A/V presentation at MMS 201, Arellano University School of Law, Taft Ave. cor. Menlo St. Pasay City, Philippines, played on 27 August 2015)

In the video was Prof. Yochai Benkler of the Yale Law School, discussing the implications of the constraints on how we structure the way we produce and exchange information.

Prof. Benkler is deeply concerned with informing us of the stakes involved in the communications structure and the organization of information, knowledge, and cultural production. He divided the discussion into four sub-topics to help us better understand his point:

  • Models of communication
  • Moment of opportunity
  • The stakes of communication structure
  • A layered view of the state of play

Different models of communication: from broadcast, to telephone, to the internet

There has been a prior discussion on these matters in the author’s reaction paper on the video by Prof. Zittrain (see Reaction Paper: “Internet Technologies”).

In addition to the comparative discussions, Prof. Benkler showed an illustration of the cycle of communication. It starts with a stimulus, its conversion into some intelligible form of message, its transmission to and reception by the recipient, and finally the completion of the cycle with another stimulus. This reminded the author of the “water cycle” from his elementary science years: water evaporates into the atmosphere, condenses, then falls back to earth through precipitation to repeat the cycle once more.

After hearing the first part of the discussion, the author observed that the trend in technology (when the video was taped) was from simple systems controlled at the endpoints to complex distributed systems that are free at the endpoints. Right now, vast information is being shared on the internet, and it is almost so unrestricted that it overwhelms one’s capacity to absorb it all. Many enterprises that drew their profits from the older model are not happy with this trend. These enterprises are now pushing a counter-trend back toward the older “broadcast” model, where there is more control over content and which is thus more favorable to their economic agenda.

A moment of opportunity: “past” forward or back to the “future”

The improvements in communication technology are based on a 150-year-old trend of commercialization and information production. Those who built the communication infrastructure were propelled by the profits they could generate out of the opportunity. In other words, communication means business. But today, we have come to realize that the economic aspect isn’t everything. The necessity of communication is slowly swallowing up economic considerations.

The author, however, disagrees with Prof. Benkler’s statement that “…the one thing that cannot be reduced is the human creativity…”. With due respect, not all persons are born with creative talents that are useful to society. Some are born with creative talents used to suppress other creative talents. That is the very reason why communication technology is unstable now. Will the big enterprises hire creative talents to ensure “open source” technology? Or will they hire creative talents to ensure their profits are secured and paid by those who can afford to pay?

Nonetheless, the statement still holds true in the emergence of at least two phenomena: (1) the increasing role of non-market information producers, and (2) the emergence of large-scale commons-based peer production. In the former, academic centers and institutions tend to disseminate information, whether for profit or not, for the benefit of the academe. But then again, the author has had the opportunity to observe that some academic and scientific publications nowadays restrict access to those who can pay the subscription fees. While these trends have reduced publication costs, there remains a question whether these organizations promote openness or not. Peer-to-peer (or P2P) production is an efficient way of collaborating with other information sources and then building the knowledge database later. The only problem with this scheme is that we rely on the presumption that all contributed pieces of information are correct and reliable. If the presumption fails, then the integrity of the system fails as well.

This is where we reach a crossroads. We are given the opportunity to move forward with new or old technology. This opportunity must be exercised with great caution in deciding how the next 150 years of communication technology will be molded and crafted.

The stakes of communications structure and the organization of information knowledge and cultural production

The question is “why do we need to care?” The author’s answer: “because it involves our most cherished freedoms: the right to free speech, among others”.

Speech is a useless exercise if it does not permeate the bounds of communication. If it is only a one-way process, such as pointing a flashlight at the clear skies, it does no good. If that is the only kind of speech granted for exercise, then there is no freedom to speak of. To the author, speech which does not end up in a channel of communication is no speech at all.

In political discourse, it is said that he who controls the pipe controls the content. The author agrees. Telecommunication companies, for example, have the final say about SMS, voice calls, or mobile internet. On the internet, the degree of flexibility of internet providers determines the amount of information users can communicate, subject to government regulations. A relevant example is the issue of “data caps” in the mobile market. For a given plan, a mobile internet user cannot access bytes of information exceeding the allowance without being charged additional fees.

In cultural discourse, the issue is autonomy. The author likewise raises the question: “who should design the window through which we view the world?” Should it be some enterprise? The government? Some other independent group? At the end of the day, it doesn’t matter. For as long as the issues are resolved, whoever they might be is irrelevant. The author adheres to a principle in solving engineering math problems: as long as the solution leads to the correct answer, it doesn’t matter; the solution is deemed correct. So long as the general issues of hunger, poverty, and injustice are solved, it may be tolerated.

Innovation should find its way to show that improvements in communication technology will not violate property rights. If innovation means open access for end users, each and every end user must do their part to agree and mutually conform to the norms. Otherwise, there will be gaps in the distributed system where certain pieces of information are not permitted to be transmitted or received.

A layered view of the state of play: the content, the logical, and the physical layers

Prof. Benkler discussed the various layers of communication technology and how they are interrelated. The content layer (information, movie shows, property rights) is connected to the physical layer (satellite, DSL and cable internet, and devices like TVs, mobile devices, and PCs) by means of the logical layer (i.e. internet protocols). In Prof. Benkler’s illustration, only a portion of the content layer is allowed free access throughout. This is so because economic considerations regulate the flow of information and communication to and from certain areas of the system.

For the physical layer, our present technology is moving toward developing open wireless networks. In the Philippines, this may be through a national or municipal broadband network. The author agrees with the idea. Just as the government or any of its branches or political subdivisions maintains roads, drainage, public utilities, and other public services, the necessity of providing access to the internet is now more demanding than ever. This, to the author, is a better idea than adopting a distributed wireless network; provided, of course, that government procurement of the service is done in accordance with law and the price and terms are the most advantageous to the government. As to devices, the author still believes that they must follow the open functionality of a PC. They must remain “open source” equipment that never restricts content. The author, however, disagrees with the proposal that devices must be built with discretion to allow or regulate certain contents, for it defeats the very purpose of free and open communication over the internet.

The problem of controlling or regulating the content is more real than apparent. Once unrestricted content is brought into the physical layer, the degree of control will be almost nil. The proposed solution was to equip devices like PCs to recognize and control, to a certain extent, the content to be enjoyed by end users, in the same way (cable) TVs work. But how about content that people are very much willing to share: social media posts? personal videos? event photos? Is it not an infringement of the right to free speech, as long as the content is not offensive? The author believes that we must strike a balance between these conflicting interests.


The author joins Prof. Benkler in advocating for “open source” technologies. As long as the medium of communication is the internet, there must be no boundary or restriction on how people send or receive information from one point to the other. The author also believes that the distributed system should not always be taken as the result of the evolution of the “centralized system”. Sometimes, we have to accept that they are just two independent systems incompatible with each other.

The economic aspect of communication technology must give way at some point. The use of technology must not always be dictated by the laws of economics, but by justice, fairness, and equity. Thus, each technology user must resist any “enclosure movement” and start claiming this new “public domain” called information and communications technology.



Posted on August 28, 2015 in Technology and the Law



Reaction Paper: “Internet Technologies”

(Reaction from the A/V presentation at MMS 201, Arellano University School of Law, Taft Ave. cor. Menlo St. Pasay City, Philippines, played on 17 August 2015)

In the video was Prof. Jonathan Zittrain of the Harvard Law School, delivering a crash course in internet technologies.

Prof. Zittrain began his introduction by anticipating that the video was either being streamed or being watched from a DVD player. He also addressed the audience as either being enrolled in an internet law program or having obtained a pilfered or pirated copy. To the author, this is a timeless opening statement and at the same time a good disclaimer. Prof. Zittrain made the opening impression that he knows the subject matter, and its future as well.

The one (1) hour video tried to answer three (3) basic questions:

  • Why is it so hard to trace the people on the Net – and so easy for them to pirate with impunity?
  • Why is video streaming so unreliable?
  • Why are we so vulnerable to viruses and hacks?

The past and the future of the internet: which technology will prevail?

In laying down the foundation, and to better understand the topic, Prof. Zittrain discussed a backgrounder on technologies before the internet: each technology existed in its own sphere, totally isolated from the others, i.e. telephone ran through copper wires, cable television ran through coaxial cables, and television and radio broadcasts ran through radio waves. Government regulations could be implemented conveniently and without complication.

But as the internet age reached industry level, there were significant additions to the three basic spheres, and connectivity was no longer isolated but crossing over. The path from one medium to another became entangled with two or more systems. Satellite TV and wireless cellular phones were introduced. Television and radio programs are no longer exclusively accessed through broadcast media; they may now be accessed through the internet, passing through copper wires. Phone calls may be made over Internet Protocol (VoIP). Cable television now also supports both the internet and phone calls, in the same way as wireless cellular phones. This is where the challenge of regulation comes in.

Considering the video was shot and released in the year 2004, there was uncertainty then about which technology would be the future and which would be obsolete. Take, for example, the TV box: will the TV box become obsolete in view of the internet’s promise of watching cable TV programs over the internet? Or will the internet suffer failure, so that we go back to a more reliable TV box to conveniently watch our favorite programs? The same question was posed about the personal computer. Prof. Zittrain predicted a convergence of technology at home, which may result in PCs being left out. But the converse may also happen, such as going back to the isolated PC in one’s room or “den” because the convergence thing did not sell. There may be a “horserace” question like “who will be the winners and losers of the information age?” But as Prof. Zittrain correctly pointed out, what is more important is the ability of the government to regulate the winners of the internet age, whoever they are, and those who use it.

There were intermediate discussions on the infrastructure of the various technologies, such as the centralized (TV and radio broadcast), decentralized (cable TV, telephone system), and distributed (the internet) systems, and the trend of adopting simplicity in interconnectivity through the “hourglass architecture”. But these discussions are too technical for a common law student with minimal or no background in information technology. What I believe is more important is, as earlier mentioned, the aspect of regulation and how to extend the arm of the law to the use or abuse of the expanding internet technology.

Why is it so hard to trace the people on the Net – and so easy for them to pirate with impunity?

In the “hourglass architecture”, each internet user is empowered to create applications and introduce any information that can be shared on the internet. The simplicity of distributing the task actually complicated the framework of the internet. Since there is no central authority to authenticate whether information, such as personal details, is true and correct, it is very hard to ascertain its validity and authenticity. In such a case, we have to rely on a presumption of correctness.

In addition, internet service providers (ISPs) may not form a single loop through which information travels. One ISP may be connected to another ISP, which is in turn connected to another, before the data enters the “cloud” where a vast array of information is kept, processed, and conveyed. To reach its destination, the information may pass through another ISP, or set of ISPs, before finally reaching the intended recipient.

In the example illustrated by Prof. Zittrain, it is indeed very easy to obtain information from a person, as in soliciting his personal details by making it appear that the solicitor is from an ISP’s billing section. Because the identity of the person soliciting may be made to portray another, without need of prior authentication, the scheme is prone to abuse and is the basic modus of hacking, identity theft, and piracy.

To the author, a decade after the video, the vastness of the internet has also made it possible to disseminate information about these “phishing” activities. However, new users of internet technology are not so aware of this and are still prospective victims of the modus. But this anonymity may also lead to something else. Nowadays, there is “torrent” technology that makes use of peer-to-peer networking to support high-volume free downloading of data, including even protected content. The introduction of “secured certificates” may still be too costly to offer a permanent solution to this problem.

Why is video streaming so unreliable?

Prof. Zittrain explained that even if you are subscribed to an ISP with sufficient bandwidth, there is no guarantee that video streaming will be reliable. This is because the ISP can only do so much as far as its own connection to the cloud extends. If the endpoint at the other end does not support the same bandwidth, there can be no reliable and seamless video transmission and reception.

One way of increasing reliability is the peer-to-peer technology which uses group resources and makes media available to two or more peers. However, it appears that this kind of open sharing scheme also opens up the gates of the household computer to many other applications including viruses and even hackers.

The unreliability may be attributed to the file size of the video itself. The conveyance of the usual office files, ranging from several kilobytes to megabytes in size, has no similar problem. The nearest similar scenario is music streaming. But then again, music streaming at several kilobytes per second is no problem even if you are just using mobile data packages. It is because of this unreliability that it is better to watch movies and instructional videos using DVD players than to stream them from a source over the internet.

The author believes that this problem may be properly addressed if one can create or derive an application which can efficiently spread and collect bits of information from the “cloud”, so that no matter how good or bad the ISPs are, we can still get the same quality of service, or video stream in this case. This seems possible because, in the past, you could only download video and other files directly: if the connection was terminated, no file was saved. But with the advent of “torrent” technology, one can download bits of information at a time and save them in the computer until all components are complete for the file to be opened, accessed, or played. If the process is interrupted at any time, it is not abandoned. Say you stop at 40%: you may resume downloading next time starting from that 40%. The only difference between this current use of the technology and the author’s proposition is that in the latter, the approach is adopted to build an efficient, rather than merely convenient, system.
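The piece-by-piece, resumable download described above can be sketched as follows. This is a toy simulation under stated assumptions (a local byte string stands in for the remote file, and a plain dictionary stands in for pieces saved to disk), not a real torrent client:

```python
CHUNK = 4  # bytes per piece; kept tiny for illustration

def download(source, saved, stop_after=None):
    """Fetch the missing pieces of `source` into `saved`; `stop_after`
    simulates a dropped connection after that many new pieces."""
    fetched = 0
    for offset in range(0, len(source), CHUNK):
        if offset in saved:
            continue  # piece already saved; skip it when resuming
        saved[offset] = source[offset:offset + CHUNK]
        fetched += 1
        if stop_after is not None and fetched >= stop_after:
            break  # connection dropped mid-download
    return saved

source = b"a complete video file, piece by piece"
saved = {}
download(source, saved, stop_after=4)  # first session is interrupted
download(source, saved)                # later session resumes where it left off
assert b"".join(saved[o] for o in sorted(saved)) == source
```

Because each piece is stored under its offset, an interrupted session loses nothing: the next session simply skips the pieces already saved, which is the efficiency the author's proposition relies on.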

Why are we so vulnerable to viruses and hacks?

Given the distributed connectivity of all computers, including mobile devices, on the internet, the process of identifying whether a program is a virus, or contains one, becomes a tedious task. Viruses are programs used to flood or disrupt a system. However, not all viruses are prima facie visible and cognizable as such. They may initially appear as links that, when clicked or double-clicked, open the program, which then spreads through the computer system.

The source of the problem appears to be at the ends of the connection: the end users of the internet. It is at these points where the vulnerability is higher than at the ISPs or in the “cloud”. As already discussed above, activities such as browsing an unknown page or clicking links, whether arbitrarily or intentionally, pave the way for a program which may actually be a virus trying to infect one’s computer, and ultimately the network.

The author agrees with Prof. Zittrain that it is a question of every internet user’s vigilance. As the saying (often attributed to Thomas Jefferson) goes: “Eternal vigilance is the price of liberty.” It is how people use or abuse the system that affects the vulnerability of computers and the internet to viruses and hacks. To prevent such problems, one must install or adopt the necessary safeguards. This may include subscribing to anti-virus programs and regularly updating them, or simply limiting one’s internet use to a minimum or “need” basis.


Internet technology is all about empowering everyone to use and develop the internet so that everybody enjoys the exchange of information. However, just like an “open well” shared by the community, drinking water is ensured only if everyone depending on it is wary to preserve its potability. By assuming individual responsibility, we can make the most of the internet with minimal fear of its unreliability or adverse consequences. In this wise, we are helping the government minimize its efforts to regulate the internet.

The author believes that the combined efforts of individuals that made possible the realization of the concept of the internet are the same combined efforts required to develop and maintain the internet and its benefits in this fast-paced modern age.


Posted on August 18, 2015 in Technology and the Law

