400 Capitol Street, NW
Washington, DC 20001
- Toll Free 1.888.564.6273
- Local 202.783.3870
Many young people were attracted to Obama’s campaign because of its “hipness” and its ability to organize through social networking sites. However, a year after Obama took office, more young people are waking up and realizing that his minimum wage policies do not serve their best interests. In fact, young people have a significantly higher unemployment rate than any other demographic. According to the San Francisco Chronicle:
Prospects for young people are bleak: teenage unemployment is 27.1 percent.
Forbes Magazine reports that:
The unemployment rate for those under 25 stands at 19%. Even for college graduates, wages are declining even as opportunities dry up.
The unemployment rate for African American youth is even higher, at 50.4 percent. Half of African American youths actively seeking employment are unable to get hired.
On his website, Obama states that he plans to:
raise the Minimum Wage to $9.50 an Hour by 2011.
The federally mandated minimum wage prevents young people from obtaining entry-level positions and gaining valuable skills. As Economics 101 explains, businesses have a disincentive to hire inexperienced young workers if they are forced to pay them at least $7.25 an hour. As a result, a willing young worker may remain inexperienced and unemployed because of the government’s intervention.
According to Joe Sabia, an Associate Professor in Public Policy at American University:
A 10% increase in minimum wage reduces retail employment by 1%, and reduces employment among young workers by 3.4%. Obama’s proposal would raise the federal minimum wage by over 30%, causing even greater job loss at a time when our economy can least afford it.
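The figures quoted above can be checked with simple arithmetic. The wage levels ($7.25 current, $9.50 proposed) come from this piece; the linear extrapolation of Sabia's elasticity estimates below is an illustrative sketch, not part of his analysis:

```python
# Percentage increase implied by raising the federal minimum wage
# from the current $7.25/hour to the proposed $9.50/hour.
current_wage = 7.25
proposed_wage = 9.50

increase = (proposed_wage - current_wage) / current_wage
print(f"Proposed increase: {increase:.1%}")  # ~31.0%, i.e. "over 30%"

# Naive linear extrapolation of Sabia's estimates (a 10% hike cuts
# retail employment ~1% and young-worker employment ~3.4%):
retail_decline = (increase / 0.10) * 0.01
young_decline = (increase / 0.10) * 0.034
print(f"Implied retail employment decline: {retail_decline:.1%}")
print(f"Implied young-worker employment decline: {young_decline:.1%}")
```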
A Wall Street Journal article confirms these findings:
Two years ago Mr. Neumark and William Wascher, a Federal Reserve economist, reviewed more than 100 academic studies on the impact of the minimum wage. They found 'overwhelming' evidence that the least skilled and the young suffer a loss of employment when the minimum wage is increased.
Individuals who begin working as teenagers are at a considerable advantage. Studies have shown that workers who do not start working as teenagers ultimately suffer longer periods of unemployment and lower long-term wages. High teenage employment has numerous positive externalities: employed teenagers have a better chance at a successful, productive future career and are less likely to commit crimes. In addition, they gain job experience, a sense of responsibility, and money management skills while building their resumes. Youth unemployment is already at its highest level since WWII; any minimum wage hike will cost young people even more jobs.
In December, the House passed a second "stimulus" plan. The proposal is now before the Senate and would redistribute $174 billion in taxpayer money throughout the economy to "create" jobs. The second stimulus, heavily backed by Democratic lawmakers, remains unpopular among Americans who have yet to see any positive effects from the first stimulus. These Americans do not trust Congress to allocate their hard-earned money efficiently to spur job growth. According to a new CNN poll, three out of four Americans believe that the majority of the money in the first stimulus plan was wasted:
A CNN/Opinion Research Corporation survey released Monday morning also indicates that 63 percent of the public thinks that projects in the plan were included for purely political reasons and will have no economic benefit, with 36 percent saying those projects will benefit the economy.
Twenty-one percent of people questioned in the poll say nearly all the money in the stimulus has been wasted, with 24 percent feeling that most money has been wasted and an additional 29 percent saying that about half has been wasted. Twenty-one percent say only a little has been wasted and 4 percent think that no stimulus dollars have been wasted.
A poll by Rasmussen Reports finds that:
56% of Americans oppose the passage of another economic stimulus package.
The majority of Americans recognize that massive increases in government spending are detrimental to the economy. Members of Congress need to be fiscally responsible and representative of their constituents, and stop passing wasteful and inefficient bills, or their jobs will be on the line in November.
According to the Associated Press, the White House interpreted Senator Scott Brown's victory in Massachusetts as a sign to start focusing on jobs:
Their conclusion was that the economy — jobs specifically and the broader topics of the nation's fiscal and financial health — must be priority No. 1.
Massachusetts' unemployment rate has risen to 9.4 percent, slightly below the national average of 10 percent. Reports claim that Obama will now shift his focus to promoting more of his costly economic "recovery" plans. These include:
—New spending for highway and bridge construction.
—Tax cuts for small businesses that increase their payrolls.
—Money to retrofit millions of homes to be more energy-efficient and create "green" jobs.
—Funds to help state and local governments avert layoffs of public-sector employees.
Various political analysts claim that Republican Senator Brown's victory confirms that Massachusetts voters are increasingly discouraged by the Obama administration's failing economic policy. These latest reckless spending proposals do not differ from Obama's first failed massive economic stimulus plan. The proposed tax cuts will help small businesses; however, they should be extended to all businesses in order to significantly increase payrolls. Obama's "new" agenda is likely to prove just as wasteful and ineffectual at spurring job growth. According to the Washington Times,
Notwithstanding the shift in terminology, the president's new plan is nothing more than another serving of his old, failed stimulus plan. It calls for another massive dose of "infrastructure spending" — up to $50 billion more for roads, trolleys, trains and sewer systems. Such spending has proved one of the most impotent components of last year's American Recovery and Reinvestment Act.
Newly elected Scott Brown ran on the principle of fiscal responsibility. By electing him, Massachusetts sent a clear and simple message: stop spending. Obama throwing billions more tax dollars at the economy while hoping for a different outcome is not what the American people want. The American people want job growth, and that can be accomplished only by lowering taxes and spending less.
FreedomWorks has joined efforts with the Competitive Enterprise Institute, the Discovery Institute, the Heartland Institute, and the Heritage Foundation to file comments opposing the FCC's proposed rules to create new regulations on the internet in the name of net neutrality.
With these joint comments the coalition wishes to express grave concern about the proposed rules, which we believe are unnecessary and may actually harm both the Internet and consumers.
FreedomWorks believes that the internet is dynamic and constantly evolving. Imposing regulations will impede innovation and investment by broadband providers, threatening both broadband deployment and network management.
FreedomWorks President Matt Kibbe added, "As the Internet continues to develop it will play an even more critical role in our lives. The elephant in the room of this debate is that new Internet applications, from moving X-ray images between hospitals to watching movies, will require more bandwidth and investment. The proposed net neutrality regulations would stall new investment by giving bureaucrats ultimate authority over Internet innovation. That's the wrong approach. The internet has been a success because it evolved relatively free from government regulation, and delivering the next generation of services requires innovation, not regulation.”
The comments of the free market coalition have been posted to the FreedomWorks website and are available at http://www.freedomworks.org/publications/proposed-regulations-pose-threat-to-internet
FreedomWorks is a grassroots organization with over 800,000 members nationwide dedicated to lower taxes, less government, and more freedom.
For further information, contact Adam Brandon at 202-783-3870.
Federal Communications Commission
Washington, D.C. 20554
In the Matter of )
Preserving the Open Internet ) GN Docket No. 09-191
Broadband Industry Practices ) WC Docket No. 07-52
Wayne Brough Chief Economist, FreedomWorks Foundation
James Gattuso, Senior Fellow in Regulatory Policy, The Heritage Foundation
Hance Haney, Senior Fellow, The Discovery Institute
Ryan Radia, Associate Director of Technology Studies, Competitive Enterprise Institute
James G. Lakely, Co-director, Center on the Digital Economy, The Heartland Institute
In accordance with the public notice issued by the Commission on October 22, 2009, we respectfully submit these joint comments regarding the Notice of Proposed Rulemaking issued in the above dockets. We wish to express our grave concern about these new rules, which we believe are unnecessary and threaten to harm both the Internet and consumers. Specifically, we address four primary issues: the critical and changing nature of network management, competition in the broadband marketplace, possible harm from the proposed transparency mandates, and the lack of statutory jurisdiction for Commission action in this area.
Innovative Network Management Requires Flexibility, Not Regulation
As noted in the NPRM, the proposed rules for an open Internet raise “important questions” with respect to network management. Indeed, critical decisions about the structure and development of the Internet may become encumbered with regulatory oversight should the proposed rules be codified. As is becoming increasingly evident with the introduction of more bandwidth-intensive content and applications, scarcity does exist within the networks that provide Internet access. With an increasing number of subscribers coming online (and the stated policy goal of accelerating this expansion), broadband providers must adapt standards and practices to address this binding constraint. This requires the flexibility to innovate to ensure that existing resources are being used as efficiently as possible.
As the Internet has become more ubiquitous, how it is being used is also changing, making concerns over network management even more important. The shift from email to BitTorrent has put new strains on the Internet that will only increase as more users and more bandwidth-intensive applications come online. One recent study found that BitTorrent, a peer-to-peer network application, was already responsible for more than 30 percent of Internet traffic by the end of 2004. Overall, peer-to-peer traffic is currently responsible for more than 60 percent of Internet traffic, poses real concerns for Internet service providers, and can affect service quality for all subscribers. Internet telephony and IPTV are a growing segment of Internet traffic and, as many have noted, the amount of data being moved across the Internet raises significant concerns over carrying capacity.
The transformation that has occurred on the Internet has generated new challenges. The emails and file transfers in the early years did not create the latency and jitter (delays and problems in the data at the end of the pipe) that today’s content and applications can generate. The shift from text files to streaming video and Internet telephony has highlighted the importance of quality of service, in both the technical and in the layman’s sense. As one commenter wrote:
“As originally envisaged, the Internet was not designed for services with real time requirements…. [A]s opposed to the traditional telephone network's use of switched circuits that are opened for the duration of a telephone conversation, the Internet uses datagrams that may have other datagrams just in front and behind that belong to different communication sources and are destined to terminate on a different host.”
Moving forward, then, two tasks are required to create the networked world of the future. First, broadband providers must continue to invest in the fundamental infrastructure of the Internet. Doing so requires investors to allocate billions of dollars to extend networks to reach rural and other customers.
In addition, there is also a growing need for improved network management. Bandwidth-intensive applications and content that place new strains on the Internet threaten to diminish the benefits for everyone. Not only is more bandwidth required, but that bandwidth must be managed more efficiently. New technologies for traffic shaping and deep packet inspection provide new management tools for the Internet, allowing broadband providers to more efficiently allocate network resources. The proposed rules, however, constrain the development of new mechanisms for better network management. Instead, they propose to regulate access, something that may limit innovation. By codifying principles of neutrality based on the “end-to-end” view of the Internet, they would effectively freeze the internet’s development at a stage that may be inappropriate for future use patterns.
New net neutrality requirements would generate a typical commons problem. That is, whenever prices are set too low, there is a predictable outcome. Economics and history tell us that a good whose price is set below a market-clearing level will lead to a shortage in which the quantity demanded will exceed the quantity supplied. By banning tiered pricing and packet prioritization, net neutrality rules would lead to a world of congestion. As in any situation with an under-priced good, there is little, if any, incentive to innovate, expand, or upgrade.
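The commons logic above can be illustrated with a deliberately simple linear supply-and-demand sketch; every curve and number below is a hypothetical assumption chosen for illustration, not an estimate from this filing:

```python
# Hypothetical linear market: quantity demanded falls with price,
# quantity supplied rises with price.
def quantity_demanded(price):
    return max(0.0, 100 - 10 * price)  # illustrative demand curve

def quantity_supplied(price):
    return max(0.0, 10 * price)        # illustrative supply curve

# The market clears where the curves cross: price 5.0, quantity 50.
# Now hold price below the clearing level, as a rule banning tiered
# pricing effectively would for scarce bandwidth:
capped_price = 3.0
shortage = quantity_demanded(capped_price) - quantity_supplied(capped_price)
print(f"Shortage at capped price: {shortage:.0f} units")  # 70 demanded, 30 supplied
```

At the capped price, demand exceeds supply and congestion (the shortage) is the predictable result, exactly as the paragraph above argues.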
This is especially true when examining emerging applications such as IPTV, gaming, telephony, distance learning, and medical technologies, all of which require more data to be moved with greater levels of effort to ensure quality of service. And it is not as if these applications are replacing many of the standard uses of the Internet, such as email and file sharing. These activities remain popular and will actually expand as broadband deployment brings more people online. The new applications and social networking under development are an expansion on top of the existing uses of the Internet, requiring even more data to be delivered to an expanding set of applications and an increasing number of users.
By contrast, Christopher Yoo suggests that network differentiation, rather than network neutrality, may be the best approach to increasing consumer welfare. Already, retailers, content providers, and applications developers are experimenting on the web, pushing technology to its limits. Just as significantly, they are also experimenting with pricing for content and applications—much of which would be challenged by the stated principles of net neutrality.
At the same time, a more dynamic Internet creates more opportunities for entrepreneurs. A faster and more agile Internet paves the way for more powerful applications and a larger audience. It must be remembered that those upgrading the Internet can only generate profits by providing a product that consumers are willing to purchase. Investments in the future of the Internet are not made to deter consumers; rather, expanded deployment and better traffic management are efforts to bring more consumers online. This encourages more participation by consumers and more opportunities for developers of both content and applications.
Far from killing the next Google, then, a more developed and differentiated Internet offers a host of new opportunities for the next generation of entrepreneurs. Importantly, it would allow entrepreneurs and developers to take full advantage of the latest standards to bring consumers a better online experience. Web applications such as YouTube, BitTorrent, and IPTV require the full capabilities of high-speed broadband access.
Moreover, the rules being proposed would necessarily put regulators in the difficult position of having to review and adopt standards for acceptable levels of such things as latency and jitter to ensure that each broadband provider's "best effort" for moving data forward is nondiscriminatory. In effect, the new mandates would put a federal agency in the role of determining which product should be available at what price, paving the way for a long and active regulatory chapter in the development of the Internet.
Open Internet Regulation Could Frustrate the Goal of Universal Access to Faster Broadband
By limiting how Internet service providers charge for access, the proposed rules may also cause harm in another way. Broadband access is provided in a multisided market, in which distinct groups share a common platform and benefit from each other's participation. New broadband subscribers create value for existing and future subscribers by lowering the average network cost per connected customer: "Twice as many customers connected allows costs to be spread in a way that reduces by half the cost borne by each customer," according to Larry Darby. Value is also created for service and application providers whose business models are related to the number of subscribers, "eyeballs," hits or other audience-related metrics.
Larry Darby estimates the consumer welfare gain would be $8 billion over 10 years if content providers shared 10 percent of the common costs of constructing fiber-to-the-home (FTTH) to under one million households in the first year and up to 28.3 million homes in the tenth year.
J. Gregory Sidak calculates savings of $3 billion to $6 billion per year if new sources of revenue allowed broadband providers to reduce access prices to the then-existing base of broadband subscribers (50.2 million households) by $5 to $10 per month.  Sidak also concluded that an additional 14.3 million homes would subscribe to broadband access in response to a $5 per month price reduction, and an additional 28.6 million homes would subscribe to broadband access in response to a $10 per month price reduction.
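Sidak's $3 billion to $6 billion range follows directly from the subscriber base and monthly price reductions cited above; a quick arithmetic check:

```python
# Reproducing the arithmetic behind Sidak's annual savings estimate:
# savings = subscriber base x monthly price reduction x 12 months.
households = 50.2e6  # then-existing base of broadband subscribers

for monthly_cut in (5, 10):
    annual_savings = households * monthly_cut * 12
    print(f"${monthly_cut}/month reduction -> "
          f"${annual_savings / 1e9:.2f} billion per year")
```

A $5 monthly reduction works out to roughly $3 billion per year, and $10 to roughly $6 billion, matching the figures in the filing.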
Innovative pricing or bundling models could help providers achieve these benefits. Instead of charging consumers for access to advertising, for example, broadband providers might charge advertisers for enhanced access to consumers. Broadband providers might lower the prices consumers pay for such access so more consumers sign up. Lowering consumer prices for broadband access would benefit access, application and service providers in addition to consumers.
Advertising revenues support both content and delivery in other media, enabling providers to charge consumers little or nothing. There is no reason to limit the practice on the Internet, particularly in view of the enormous investments that will be necessary to achieve universal access to fast broadband. A non-discrimination rule will create a loss in consumer welfare by making it more difficult to spread the cost of maintaining and upgrading a rapidly growing Internet.
Addressing suggestions by some that fees on content providers should be banned altogether, Alfred Kahn testified:
[F]ail to comprehend—or choose to ignore—that the market here is “two-sided”—providing Internet content and services to consumers and the attention of consumers to advertisers. It makes no more sense, therefore—and is clearly misguided for consumer advocates—to want to forbid the broadband access suppliers that carry those advertising messages charging the advertisers for access to the public than to require newspapers, television broadcasters or cable companies to obtain their revenues exclusively from readers, viewers or subscribers.
Despite the fact that massive investment will be required in all segments of the Internet, and that the Internet is a multisided market where all groups derive value from each other's participation, net neutrality regulation would limit innovative pricing and bundling. If net neutrality regulation sets current revenue models into place forever, it will force broadband service providers to recover more of the cost of upgrading and maintaining the Internet from consumers. Policymakers should be mindful of consumer welfare losses as they consider whether net neutrality regulation is in the "public interest" or merely in the self-interest of some but not all groups in a multisided market.
The Broadband Market is Competitive
While the NPRM does not rest its case for regulation on a lack of competition, it does suggest that market power is a major factor in justifying the regulation.  Yet, there is competition in this market, and it is growing. According to the FCC’s own 2009 report on high-speed services, high-speed DSL connections are available to 83 percent of the households to whom local telephone service is available, and high-speed cable modem service is available to 96 percent of households passed by cable. Thus the vast majority of Americans have a choice of wired broadband providers.
If not a monopoly, then is broadband a "cozy duopoly," in which two firms share market power to the detriment of consumers? The data, and the rivalry between the major players, suggest not. Certainly, the market is relatively concentrated, with two leading providers holding a high market share in most markets and a number of smaller competitors. But this market structure is similar to that of many other industries where competition is anything but cozy.
Coke and Pepsi, for instance, dominate the soft drink market but compete intensively against one another, with several smaller providers at their heels. Likewise, the supermarket industry, one of the most competitive in the economy, features two major supermarket chains in many cities, along with a number of smaller players, often with a specialized niche—such as club purchases, organic and natural foods, or gourmet fare.
Similarly, in broadband, telephone and cable companies compete intensely against each other for broadband customers on price and quality. Importantly, this competition is not just in the broadband market itself but also in other, related markets where the two industries are rivals. Using Internet-based telephony technology, for instance, cable firms are increasingly challenging the traditional telephone companies’ share of that market. At the same time, Verizon and AT&T are entering video markets, using their broadband networks, challenging traditional cable firms.
A bevy of smaller providers and technologies also keep this market in check. New providers offer alternative avenues to consumers, with WiMax creating the potential of wireless networks covering entire towns or cities. Other technologies under development, such as broadband over power lines (BPL), also hold promise for future deployment and new competitors.
And the market share of many of these alternatives is growing: According to the Pew Internet and American Life Project, from 2007 to 2009, the percent of homes with fixed satellite or wireless broadband increased from 8 percent to 17 percent, more than a twofold increase. In all, 43 percent of Americans report they have more than two broadband choices available to them. Not all of these technologies, because of slower speeds or other limitations, are perfect substitutes for the major services. But by providing niche or specialized services, they do provide competitively significant alternatives.
Moreover, the wireless broadband market in the United States is characterized by intense competition in all areas, including applications, devices, and network management (including spectrum efficiency). Over 95 percent of Americans live in census blocks with at least three wireless providers, and more than 60 percent live in census blocks with five or more providers. In addition, the number of cell phone subscribers has increased, prices are falling, and quality and offerings are expanding—hardly signs of a non-competitive market. As the Commission itself concluded in its most recent wireless competition report:
U.S. consumers continue to reap significant benefits – including low prices, new technologies, improved service quality, and choice among providers – from competition in the CMRS marketplace, both terrestrial and satellite CMRS. The metrics below indicate that there is effective competition in the CMRS market and demonstrate the increasingly significant role that wireless services play in the lives of American consumers. In particular, these metrics indicate that wireless technology is increasingly being used to provide a range of mobile broadband services.
Both wireline and wireless markets are competitive, dynamic, and expanding. Blocking particular websites or limiting web access, practices commonly alleged by net neutrality advocates, makes little sense in such markets. Competition requires giving consumers what they are seeking: the best online experience possible. For broadband providers, then, blocking websites or limiting access does nothing to increase broadband penetration.
In addition, it must be remembered that large content and application providers may exhibit a degree of market power themselves. For example, if Verizon tried to block a popular site such as Amazon or Google over disagreements on pricing, consumers might abandon Verizon in favor of a broadband provider that does connect to the sites they demand, which raises an important issue. Namely, where does the monopoly threat end? If, in fact, Google, with more than half the market for Internet searches, does have market power, is regulation required in the name of net neutrality? Similarly, even content providers may possess market power, with a provider such as ESPN able to extract surplus rents from ISPs.
A firm in any component of the Internet can expand to provide new services, as has been witnessed by Google’s jump from applications to content with the $1.65 billion purchase of YouTube and now to hardware with the introduction of a new mobile phone. The major concern is whether a firm can leverage market power in one area into market power elsewhere. Proponents of new open Internet regulations assert that broadband providers have monopoly power in the last mile that could be used to create market power in the content layer. But Google’s search engine had a market share of over 82 percent in December 2009; indeed, the two top search engines had a combined market share of roughly 90 percent, but it would be difficult to demonstrate that this warrants new regulations. All of these situations can be addressed by existing antitrust laws should abuses occur. These are phenomena that are general to any market and no legislation or special rules are necessary.
Google has leveraged itself into the content market with YouTube and other services. With respect to advertising, the search giant is clearly engaged in a two-sided market, matching advertisers to consumers using algorithms to extract as much surplus as possible. While this may be an example of an application that is expanding into the content market, it does not seem to suggest that Google is abusing market power. Similarly with broadband providers, in the absence of evidence of abuse, new regulations are not required. The existing antitrust laws are more than ample to address any such concerns.
As the online world emerges and evolves, questions of market power may arise that cut across all market participants. Net neutrality has, to date, focused primarily on the physical infrastructure. Yet the diversity of potential sources of market power suggests that the existing antitrust laws embody a degree of flexibility that makes them more aptly suited to addressing any problems than would an ex ante, proscriptive “open Internet” regulatory regime.
Need For Mandated Transparency Not So Clear
The Commission has proposed a transparency rule that would require broadband Internet providers to disclose information about their network management practices. The NPRM notes that various broadband Internet service providers currently offer a range of information about their network management practices (and about other service aspects such as bandwidth and usage limitations). Some providers only explain their network management practices in broad strokes and do not provide many specifics. Other providers make available details about how they manage applications and network congestion. The Commission regards the lack of consistent network management transparency as problematic.
Yet the absence of universal disclosure is hardly unusual in a competitive market setting characterized by differentiated service offerings. Internet service providers compete with each other not only based on speed, reliability, and price, but also on network management practices. To the extent that consumers prefer providers that disclose their network management practices, consumers can weigh network management transparency against other aspects of service in deciding which broadband provider to patronize.
In proposing a network management transparency principle, the Commission fails to appreciate the extent to which proprietary product elements can be legitimate, and sometimes crucial, features of a competitive marketplace. Particularly in high-tech industries, it is commonplace for competing firms to offer services to consumers that "just work" without making public intricate details about these services' underlying functionality. Google, for instance, does not disclose its search algorithm, known as Google's "secret sauce," which is used to determine page rankings on its search engine. While some consumers would probably be better equipped to select among competing search engines were all search algorithms open to the public, and many content providers could undoubtedly attract larger audiences were search results easier to "game," no search engine in widespread use currently discloses its search ranking algorithm. One major reason for this secrecy is that search businesses overwhelmingly prefer to maintain trade secrets that enable them to acquire and maintain economic advantage over competitors. The ability to develop non-public algorithms that deliver superior results is a competitive weapon, incentivizing firms to invest heavily in building better tools for indexing the Web.
Broadband providers’ network bandwidth is generally a scarce resource, and providers cannot always fulfill all data transfer requests instantaneously. Determining how to allocate scarce network resources is a complex business decision, and providers are constantly experimenting with a range of approaches toward managing network congestion. While certain network management practices have been deemed by the Commission to violate the non-discrimination principle, the proposed transparency principle would likely require the disclosure of all network management practices, discriminatory or otherwise.
While the NPRM expounds at great length on the innovation that is continuously occurring in the content sphere that sits atop the Internet, there is scarcely any mention of innovation in the network operations sphere, which includes network management techniques. Just as forcing search engines to disclose their algorithms would likely discourage future innovation in online search, mandating network management disclosure might stifle the development of novel approaches to network management.
The NPRM identifies a number of potential benefits that would purportedly arise from greater disclosure by Internet service providers of their network management practices. However, the Notice fails to account for potential market distortions that might result from a transparency principle. The Commission observes that “a large number of commentators on open Internet principles in our Broadband Industry Practices proceeding…believe that broadband Internet access service providers should be required to disclose more information about their network management practices than they currently disclose.”
Despite the abundance of support among commentators for greater network management transparency, the Commission must consider the possibility that these “information asymmetries” in broadband might be better understood as efficient asymmetries that reflect a competitive and innovative marketplace.
In analyzing the costs and benefits of implementing a transparency principle, the Commission must carefully consider various reasons why some providers might prefer not to disclose their network management practices. The Commission acknowledges that imposing onerous disclosure mandates runs the risk of causing consumer confusion and saddling Internet service providers with undue burdens. However, the Commission must also take into account the competitive and other reasons why mandatory network management disclosure might be undesirable in the broadband market.
The NPRM states that “access to information plays a vital role in maintaining a well-functioning marketplace that encourages competition, innovation, low prices, and high-quality services.” Mandating access to such information, however, is squarely at odds with the Commission’s goal of promoting ubiquitous, affordable broadband service.
To the extent that the market for last-mile broadband Internet service in some areas suffers from inadequate competition, mandating network management transparency as a means of addressing perceived “market failure” will only compound consumers’ woes. Especially in sparsely populated areas, the cost of entering the broadband market can be quite substantial. Imposing federal regulations that constrain the flexibility of broadband providers to develop economically viable services would give would-be market entrants one more reason not to invest in bringing wireless or wireline service to underserved areas.
The NPRM also omits mention of the myriad network management information sources available to consumers other than service providers themselves. In fact, the complaint that the Commission received in November 2007 regarding Comcast’s network management practices was based on network management data initially obtained by an individual Comcast subscriber and subsequently verified by the Associated Press and the Electronic Frontier Foundation.
Especially since the Comcast network management controversy made headlines in 2007, an array of bloggers, news reporters, advocacy groups, and others have vigilantly monitored the network management practices of broadband service providers. While such public scrutiny does not have the force of law, it is nevertheless a crucial element of market discipline that checks potentially anti-consumer behavior by service providers.
These disciplinary mechanisms inherent to competitive markets are far more flexible and fast-moving than federal regulations, and the Commission should err against regulating well-functioning markets simply because a handful of “mistakes” have been made over the years by competing firms. As consumer preferences continue to evolve, service providers can and will amend their network management disclosure policies.
The Commission Has No Statutory Authority to Regulate ISPs
Regardless of the merits of the proposed rules, there is serious doubt as to whether the Commission has the statutory authority to adopt them. Nowhere in the Communications Act does Congress grant authority to the FCC to regulate the Internet.
The NPRM nevertheless asserts such jurisdiction, primarily using the doctrine of “ancillary jurisdiction.” This court-defined doctrine, itself to be found nowhere in the text of the Communications Act, holds that the Commission can act in matters that fall within its general statutory grant of jurisdiction and are “necessary to ensure the achievement of the Commission’s statutory responsibilities.” The doctrine was originally employed to allow the Commission to regulate cable television retransmission of broadcast signals, and later used to justify a number of other actions, including pre-emption of state regulation of consumer premises equipment.
It is in itself a remarkable legal theory, allowing a regulatory agency to act in areas where there is no grant of authority, simply because it is related to an area in which authority has been granted. In a very real sense, it is a “horseshoes and hand grenades” doctrine, in which close is good enough to count.
Even within the framework of ancillary jurisdiction, however, the case for jurisdiction in this case is startlingly tenuous and dangerously broad.
In the NPRM, the Commission asserts that the proposed regulations on Internet service providers are ancillary to responsibilities under sections 230(b) and 706(a) of the Communications Act. Section 230(b) sets forth a number of general policy goals, including “the development of technologies which maximize user control over what information is received by individuals…who use the Internet…” Section 706(a) requires the Commission to “encourage the deployment…of advanced telecommunications capability to all Americans…”
There are several problems with this, however. First, Section 230(b) does not actually confer any responsibilities on the Commission. It is merely a set of findings preceding a grant (in part (2)) of immunity for blocking of offensive material. And while section 706 directs the Commission to “encourage” the development of technologies, it provides for the Commission to do so through a regular Notice of Inquiry, after which it would issue a finding if technology is not being adequately deployed. The Commission has never issued such a finding. Moreover, as Barbara Esbin of the Progress and Freedom Foundation points out, the Commission itself has held that section 706 does not provide any additional regulatory jurisdiction.
The Commission also cites Title III of the Act as a source of its authority in this area. But Title III, at best, only provides jurisdiction over wireless Internet providers. Without considerable logical gymnastics, it would not support regulation of wired carriers.
It should also be noted that the authority the NPRM claims stems from each of these provisions is apparently unlimited. The language cited provides no definable limits on what may be done to “develop technologies” or “encourage deployment,” and the Commission offers no boundaries of its own. Any action – up to and including full-scale common carrier regulation of Internet service providers – seemingly is authorized. Such an undefined grant of power would be dangerous, if not unconstitutional, and is unlikely to have been the intent of Congress.
Moreover, the somewhat desperate search for some clause or subsection of the Communications Act conferring jurisdiction on the Commission is in itself an indication that Congress never intended to provide such authority. The proposed regulations, all agree, would be significant in scope and effect, and a major expansion of the Commission’s ambit to a new, and up to now relatively unregulated, sector of the economy. It is unlikely, in fact nearly beyond belief, that Congress would confer authority to impose such regulation in an unrelated backwater of the Communications Act, requiring the skills of Sherlock Holmes to find.
Congress typically does not play “hide and seek” with jurisdiction. If it did want the Commission to be able to impose regulations on the Internet it easily could have said so. And if the Commission decides that such regulation is needed, it should request clear authority from Congress.
Importantly, the lack of Commission authority to regulate ISPs does not necessarily leave a gap in regulatory protection for consumers. Other agencies, most notably the Federal Trade Commission, have the expertise, authority, and responsibility to act to prevent market abuses by Internet service providers. In fact, given the FTC’s nearly 100 years of experience in identifying and addressing market abuses through application of the competition laws, that agency is better positioned to handle this issue than is the FCC.
Should the Commission not be willing to drop its assertion of authority in this area, there are several steps that still could be taken which would clarify the issue and avoid confusion. First, it should defer a final decision until a decision is reached by the D.C. Court of Appeals in Comcast v. FCC. In that case, an appeal from the Commission’s 2008 ruling that Comcast violated the current net neutrality principles, the question of jurisdiction has been squarely raised. A slight delay in action in this docket would likely help avoid confusion and uncertainty. Second, the Commission should fully and explicitly explain what limits – if any – there are on its claimed authority over Internet services. Both steps would help reduce uncertainty and provide a clearer basis for moving forward.
Senior Fellow in Regulatory Policy
The Heritage Foundation
The Discovery Institute
Associate Director of Technology Studies
Competitive Enterprise Institute
James G. Lakely
Co-director, Center on the Digital Economy
The Heartland Institute
 Parker, Andrew. “P2P in 2005,” CacheLogic Research (Aug. 2005), available at http://www.cachelogic.com/home/pages/research/p2p2005.php.
 The Wik Consult, The Economics of IP Networks – Market, Technical and Public Policy Issues Relating to Internet Traffic Exchange, Study for the European Commission (May 2002), p. 36.
 Yoo, Christopher S. “Beyond Network Neutrality,” Harvard Journal of Law and Technology, vol. 19, no. 1, (Fall 2005).
 Specifically, it should be noted that the NPRM would allow broadband Internet access service providers to charge subscribers different prices for different services, but would prohibit them from charging content, application, or service providers for enhanced or prioritized service. NPRM ¶ 106. Yet, in the next line, the NPRM states that the rule would not prevent a broadband provider from charging different prices for different services. This is contradictory: enhanced or prioritized service is a different service.
 Darby, Larry F., Consumer Welfare, Capital Formation and Net Neutrality: Paying for Next Generation Broadband Networks, American Consumer Institute (Jun. 6, 2006) available at http://www.handsoff.org/hoti_docs/studies/ACI060606.pdf at 38 (“Because of the externalities among different sides, platform providers cultivate all sides. Thus, newspapers need readers and advertisers; broadcast networks need station affiliates, program producers, viewers, and advertisers; credit card companies need cardholders and participating merchants; Internet search engines providers need searchers, content, and advertisers; and so on …. the Internet is comprised of agents that both receive value from and confer value upon other agents.”)
 Id., at 28.
 Id., at 9 (citing Hagiu, Andrei).
 Sidak, J. Gregory, A Consumer-Welfare Approach to Network Neutrality Regulation of the Internet, Journal of Competition Law and Economics, Vol. 2, No. 3 (September 2006) available at http://jcle.oxfordjournals.org/cgi/content/abstract/2/3/349 at 464-66.
 Kahn, Alfred E., Prepared Remarks, Federal Trade Commission (February 13, 2007) available at http://www.ftc.gov/opp/workshops/broadband/presentations/kahn.pdf at 6. (Kahn is the Robert Julius Thorne Professor of Political Economy (Emeritus) at Cornell University who has also served as chairman of the New York Public Service Commission, chairman of the Civil Aeronautics Board, Advisor to the President (Carter) on Inflation, and chairman of the Council on Wage and Price Stability.)
 NPRM, ¶ 67 et seq.
 Federal Communications Commission, Wireline Competition Bureau, Industry Analysis and Technology Division, “High-Speed Services for Internet Access: Status as of June 30, 2008” (July 2009) available at http://www.fcc.gov/Daily_Releases/Daily_Business/2009/db0723/DOC-292191A1.pdf.
 Gifford, Raymond. “Signs of a Not So Cozy Duopoly,” Progress and Freedom Foundation Blog (June 30 2005), available at http://blog.pff.org/archives/2005/06/signs_of_a_not.html#more.
 Horrigan, John, “Home Broadband Adoption 2009,” Pew Internet & American Life Project (June 2009), available at http://pewInternet.org/~/media//Files/Reports/2009/Home-Broadband-Adoption-2009.pdf.
 Federal Communications Commission, “Thirteenth Annual Report and Analysis of Competitive Market Conditions With Respect to Commercial Mobile Services” (Jan. 16, 2009), available at http://hraunfoss.fcc.gov/edocs_public/attachmatch/DA-09-54A1.pdf, ¶ 2.
 Id., ¶1.
 See, e.g., Raff, Adam. “Search, But You May Not Find,” New York Times (Dec. 28, 2009), p. A27. The author makes the argument for additional regulations to cover “search neutrality,” given the market dominance of Google’s search engine.
 “Google To Acquire YouTube for $1.65 Billion in Stock,” (Oct. 9, 2006), available at http://www.google.com/press/pressrel/google_youtube.html.
 Market Share by Net Applications (accessed Jan. 14, 2010), available at http://marketshare.hitslink.com/search-engine-market-share.aspx?qprid=4&qpdt=1&qpct=3&qpcal=1&qptimeframe=Y&qpsp=2010.
 NPRM, ¶ 123.
 Collett, Stacy. “Cracking Google’s ‘secret sauce’ algorithm,” ComputerWorld (March 14, 2007), available at http://www.computerworld.com/s/article/9012943/Cracking_Google_s_secret_sauce_algorithm.
 Szoka, Berin. “First Amendment Protection of Search Algorithms as Editorial Discretion,” Technology Liberation Front (June 4, 2009), available at http://techliberation.com/2009/06/04/first-amendment-protection-of-search-algorithms-as-editorial-discretion/.
 NPRM, ¶¶ 62-63.
 NPRM, ¶ 122.
 NPRM, ¶ 126.
 NPRM, ¶ 118.
 Park, Eun-A and Taylor, Richard. “Barriers to Entry Analysis of Broadband Multiple Platforms: Comparing the U.S. and South Korea,” Telecommunications Policy Research Conference Working Paper (Oct. 1, 2006), available at http://web.si.umich.edu/tprc/papers/2006/636/TPRC2006BarriersToEntry.pdf.
 Bode, Karl. “The EFF ‘Test Your ISP’ Project,” DSL Reports (Nov. 28, 2007), available at http://www.dslreports.com/shownews/The-EFF-Test-Your-ISP-Project-89789.
 NPRM, ¶¶ 83-87.
 FCC v. Midwest Video Corp., 440 U.S. 689 (1979).
 Esbin, Barbara. “Jurisdiction: The $64,000 Question,” Progress Snapshot Vol. 5, No. 12 , (Nov. 2009), available at http://www.pff.org/issues-pubs/ps/2009/ps5.12-jurisdiction-64000-dollar-....
 Speta, James B. “The Shaky Foundations of the Regulated Internet,” 8 J. on Telecomm. & High Tech Law 101 (2010), available at http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1529318.
Before the Federal Communications Commission Washington, D.C. 20554
Over the past few days, Obama has held extensive closed-door meetings with Democratic leaders to discuss how to pay for his massive health care plan. Politico reports that:
Those involved in the talks sought to keep details of their progress under wraps. The negotiations delved into the Senate’s controversial tax on expensive insurance plans, which unions and House members strongly oppose, and, for a while Wednesday, labor leaders huddled privately with top administration aides.
Reports released yesterday announce that Democrats have reached a deal with union leaders to tax expensive employer-provided health insurance plans, a levy also known as the “Cadillac tax.” Barely any details have been released thus far. However, CNN confirms that Democrats:
…have reached a tentative deal with labor unions on restructuring a tax on high-end insurance plans, a key proposal to pay for health care reform, according to two sources familiar with the fast-moving talks.
According to Thomas Firey, a scholar at the Cato Institute, this damaging “Cadillac Tax” would:
[A]dd a 40% excise tax to any amount above $8,500 paid for an individual worker’s coverage, or above $23,000 for a worker’s family. Labor leaders claim that a quarter of unionized workers would be subject to the tax, and government analysts estimate that 22 percent of all workers would be subject to it in 10 years.
It is a common practice for employers to increase their employees’ health benefits as an alternative to increasing their wages. This tax therefore has the potential to negatively affect workers across various income levels. The Washington Post reports that such workers are staunchly opposed to the proposed excise tax:
[T]he proposal has infuriated many House members as well as union leaders, who argue the tax could strike deeply into the middle class to affect small businesses and older workers as well. They also complained that the tax, which would raise nearly $150 billion over the next decade, was designed to strike an ever-growing number of health policies.
Looking back to 2008, Obama declared:
I can make a firm pledge. Under my plan, no family making less than $250,000 a year will see any form of tax increase. Not your income tax, not your payroll tax, not your capital gains taxes, not any of your taxes.
But this “Cadillac Tax” would be passed on to middle-class workers in the form of higher premiums. Such a tax would be just one more of the many campaign promises that the President has broken.
In 2008, Obama stated that the health care debate would be open and televised:
But here’s the difference: I’m going to do it all on C-SPAN so that the American people will know what’s going on.
Despite Obama's promise, the health care debate has continued behind closed doors. U.S. Rep. Vern Buchanan (R-FL) plans to hold Obama to his word. His sponsored resolution, the "Sunshine Resolution," would demand that the health care bill negotiations be open and
...under the watchful eye of the American people.
Today, Buchanan declared that he would file a discharge petition that would require that the House vote on his resolution. Buchanan explained the necessity of the "Sunshine Resolution":
In Florida we have one of the strongest right-to-know laws aimed at ensuring that government leaders conduct the people’s business in public. No private meeting, no backroom deals, no secrecy. It’s time to shine some Florida Sunshine on the Halls of Congress.
According to House Minority Leader John Boehner (R-OH) the "Sunshine Resolution" is needed to ensure that Congress is accountable and open to the American people:
We’re taking this step because something as important as the Democrats’ health care bill, with its Medicare cuts and tax hikes, should not be slapped together behind closed doors. Secret deliberations are a breeding ground for mischief, including sweetheart deals that end up not being discovered until it’s too late.
If a majority of the House sign Buchanan's discharge petition then the "Sunshine Resolution" will be voted on regardless of any objections. Health care reform legislation will likely be a radical change to Americans' health care. For the sake of the American people, it is imperative that these negotiations occur in public with no backroom deals.
Despite soaring unemployment rates, the Obama administration persistently alleges that the stimulus plan has been a success. According to Christina Romer, chairman of the Council of Economic Advisers:
The most important bottom line is to say that close to 2 million jobs have been created or saved by the close of 2009, a truly stunning . . . effect of the act.
In reality, the unemployment rate remains at 10 percent. Since the passage of the stimulus bill, Americans have lost 2.7 million jobs. In fact, 85,000 jobs were lost in December alone. So how can the Obama administration continue to report that nearly two million jobs have been “created or saved”? It’s simple. They have now altered the way in which “saved” jobs are calculated. The Associated Press explains their latest counting method:
Despite mounting a vigorous defense of its earlier count of more than 640,000 jobs credited to the stimulus, even after numerous errors were identified, the Obama administration now is making it easier to give the stimulus credit for hiring. It's no longer about counting a job as saved or created; now it's a matter of counting jobs funded by the stimulus.
This new method of counting “saved” jobs will likely lead to even more misleading and inflated statistics from the White House. The White House's deceptive statistics now include any person that has received stimulus funding, regardless of whether their job would have been lost without the stimulus money. The Associated Press’ article adds:
That means that any stimulus money used to cover payroll will be included in the jobs credited to the program, including pay raises for existing employees and pay for people who never were in jeopardy of losing their positions.
It is time for the White House to stop reporting fallacious statistics. The Obama administration should be sincere with the American people. The truth is that Obama's stimulus plan is failing to live up to his expectations of saving or creating 3.5 million jobs by the end of 2010. If the current trend continues, we may lose 3.5 million jobs by the conclusion of 2010.
On Friday, President Obama boldly stated that his stimulus bill has been:
…a major force in breaking the trajectory of this recession and stimulating growth and hiring.
The enormous $787 billion stimulus plan was signed in February 2009. The New York Times reported that in February:
The unemployment rate surged to 8.1 percent, from 7.6 percent in January, its highest level in a quarter-century.
In a radio address in late January 2009, Obama claimed that his stimulus plan must be passed in order to reverse the rising unemployment trend:
Experts agree that if nothing is done, the unemployment rate could reach double digits. If we do not act boldly and swiftly, a bad situation could become dramatically worse.
However, since the passing of the stimulus plan, the unemployment rate has continued to rise at an alarming rate. In October, the unemployment rate reached double digits with the stimulus plan in effect.
The following graph shows the outrageous number of job losses throughout 2009:
On Friday, the Associated Press reported:
Gripped by uncertainty over the economic recovery, employers chopped 85,000 jobs last month, and difficulty finding work helped chase more than half a million people out of the job market. The unemployment rate held steady at 10 percent. It did not creep higher only because so many people stopped looking for work and are technically not counted as unemployed.
The truth of the matter is that the stimulus plan has not provided any economic relief. Americans are still waiting on the growth and hiring that the stimulus plan was intended to boost. According to radio host Ed Morrissey:
Not only did the stimulus plan not work, the unemployment actually rose faster with the 'recovery plan' in place than the Obama administration predicted unemployment would rise without it.
Unless Obama intended to lose 2.7 million jobs and prolong the recession, his wasteful stimulus plan should be deemed a failure.
If it's true that states are the "laboratories of democracy," California's cap and trade experiment should be considered a failed one, and the federal government ought to think twice about implementing the same economy-killing measures across the nation. The Wall Street Journal has the story of a possible ballot measure to at least temporarily repeal California's cap and trade carbon tax.
So Republican Assemblyman Dan Logue has begun collecting signatures for "The Global Warming Solutions Act," a ballot initiative that would suspend California's cap-and-trade scheme until the unemployment rate falls below 5.5%. He's aiming to get it on the November ballot.
No matter what one thinks of climate science, it makes little sense for an individual state to unilaterally impose major new tax and regulatory costs on its own industries. The impact of California's gesture on global temperatures will be infinitesimal, but the economic impact will make the state even less attractive to start or expand a business.
The law all but encourages outsourcing to Nevada, Texas, China and India. Even the liberal Sacramento Bee, which supports the law, says that policy makers should be "candid about the real costs of the transition it is contemplating. . . . Industries that are energy-intensive will move elsewhere."
Meanwhile, a new study commissioned by the Governor's Office of Small Business Advocacy estimates that the direct cost of current California regulation is $175 billion, or nearly twice the size of the state general fund budget and about $134,000 per small business each year. The Golden State already has the second most business-unfriendly regulatory climate in the nation, after New Jersey and before the cap-and-trade law.
The stakes here are huge, and not merely for California. This is the first serious effort to roll back the environmental extremism that has dominated state capitals in recent years and is now ascendant on Capitol Hill. The green lobbies and businesses that have a monetary stake in cap and trade—including big utilities that want subsidies and Silicon Valley political capitalists investing in solar and ethanol—are sure to spend heavily to stop it. They know that an electoral defeat in the greenest of states could end their national and global hopes for cap and trade.
For Californians the issue is simpler: Whether they want to continue to impose burdens that encourage employers to locate anywhere except their once prosperous state.
Be sure to note the big utilities that want subsidies. Ay, there's the rub. It's not just your typical greenies who would like industry to have stopped progressing sometime in the last century; the real danger is the corporations that are looking for taxpayer-funded subsidies and who will game the system to their advantage and the taxpayer's disadvantage. Sign the petition to oppose this next big government bailout here.