In a positive development, President Obama signed the Judicial Redress Act yesterday. The new law will enable citizens of some of our allies to sue the U.S. government for violating their personal privacy rights.
The bill passed with bipartisan support in both the U.S. House and Senate and was signed by the President soon after he received it. The enactment of this legislation was needed for the new U.S.-EU Privacy Shield Agreement to move forward in Europe.
The bottom line is that the Judicial Redress Act extends some of the privacy rights our citizens enjoy to the citizens of our allies and demonstrates that our country is serious about protecting the personal privacy of our allies' citizens. It signals that the U.S. may be moving toward stronger digital privacy rights in matters that may affect international commerce.
Copyright 2016 by the Law Office of Bradley S. Shear, LLC. All rights reserved.
Thursday, February 25, 2016
Friday, August 28, 2015
Google Refuses To Acknowledge The Law In Response To European Antitrust Complaint
Earlier this year, the European Commission (EC) sent a Statement of Objections (formal complaint) to Google for violating European antitrust (competition) laws. In particular, the EC alleges Google “has abused its dominant position in the markets for general internet search services in the European Economic Area (EEA) by systematically favouring its own comparison shopping product in its general search results pages. The Commission's preliminary view is that such conduct infringes EU antitrust rules because it stifles competition and harms consumers.”
Yesterday, Google responded to the EC's complaint with a defiant, 100-plus-page response and a blog post. Interestingly, Google did not request a hearing on the matter, and this tactic has lent credibility to its opponents' claims that if Google were confident its legal position is correct as a matter of law, it would request a hearing to defend itself. A spokesman for the EC told Bloomberg News that "[i]t's common for companies to ask for an oral hearing but it doesn't happen all the time".
In my experience, guilty parties generally hide behind written submissions and avoid direct confrontation with their accusers. According to Bloomberg News, "[h]earings can make a difference. Thirteen of the world's biggest banks succeeded at a face-to-face confrontation last year to unsettle an EU case into the credit-default swaps market...No fines have been issued in that case." Therefore, Google's refusal to face the EC in an oral hearing suggests to me that it believes it has violated European competition law.
Google's cavalier behavior over the years with regard to competition, privacy, and accepting illegal ads clearly demonstrates that it believes it is above the law. Since the EC opened its antitrust investigation into Google, the company has paid hundreds of millions of dollars in fines and settlements due to illegal behavior. In each of these situations, Google dragged its heels when it was caught intentionally misleading regulators, consumers, and/or the media.
In 2011, Google paid a $500 million fine for knowingly accepting illegal advertisements from Canadian pharmacies. Subsequently, it paid multimillion-dollar fines in the United States and in Europe for privacy violations in connection with its Street View data collection project, its Buzz social network, its 2012 privacy policy change, and the Safari hack incident.
Illegally abusing market position in Internet search (and/or other areas) is intertwined with data collection, usage, and privacy issues, because in order to return the most "relevant" results to a search query, a search engine must be able to access and process voluminous amounts of data very quickly. For years, 90% to 96% of Google's revenue has come from advertising, which means it is dependent upon obtaining massive amounts of personal information at low cost to feed its behavioral advertising machine.
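As a rough illustration of that dependence, the arithmetic behind an advertising share in that range can be sketched in a few lines of Python. The revenue figures below are hypothetical round numbers chosen for illustration, not taken from any actual filing:

```python
# Illustrative only: hypothetical annual revenue figures (billions of USD)
# showing how an advertising share in the 90-96% range is computed.
total_revenue = 66.0   # hypothetical total revenue
ad_revenue = 59.6      # hypothetical advertising revenue

ad_share = ad_revenue / total_revenue
print(f"Advertising share of revenue: {ad_share:.1%}")
# prints: Advertising share of revenue: 90.3%
```

A company in that position has a structural incentive to collect as much personal data as possible, since nearly every dollar of revenue depends on it.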
Countries apply different legal criteria when determining whether a company has violated antitrust laws or whether a potential merger will create an anti-competitive market. Europe has a long history of regulating anti-competitive markets; since Roman times, the continent has regulated commerce to ensure competition and fair play. The EC is not targeting Google out of nationalistic fervor to boost EU-based companies. Google is being targeted because it is clearly utilizing its dominant position to violate antitrust laws.
The EC has actively enforced its competition laws for years. Last year, a $1.44 billion fine against Intel for anti-competitive behavior was upheld after a fight of more than five years. In 2013, Microsoft was fined $731 million for not adhering to its previous antitrust agreements. So why does Google think it is above the principles that have governed European markets for more than 2,000 years?
My hope is that the EC utilizes all of the legal and regulatory tools at its disposal to ensure that Google and other companies that violate EC competition and privacy laws are held accountable. Internet users around the globe are harmed when companies such as Google violate antitrust laws.
Copyright 2015 by the Law Office of Bradley S. Shear, LLC. All rights reserved.
Monday, March 23, 2015
New York Times Facebook Content Deal Is A Threat To Personal Privacy
The New York Times is one of the world's most respected news organizations and one of the most popular destinations for news on the Internet. However, I was dismayed to read in The New York Times that it may strike a deal to house some of its content inside Facebook.
This is a very troubling development, not just for the media landscape but also for freedom of thought and expression. The potential deal will erode the privacy of The New York Times' readers and enable data brokers and their clients to create richer profiles of those who read the paper via Facebook, due to Facebook's troubling deals with multiple data brokers.
When a New York Times reader utilizes Facebook to access articles, this information will be sent to Facebook's data broker partners, who will insert it into the user's digital dossier. This data may be utilized by banks, insurance companies, employers, etc., to discriminate against people for reading about certain topics. For example, when someone reads many articles about their race, sexual orientation, health issues, or religion, this data will be tracked, and a data broker may provide it to one of its clients, who may use it to decide whether a reader is a good fit for a job.
While ad networks and other digital tracking platforms already combine every digital morsel about users they can find, being able to track users from their personal Facebook accounts creates a new level of data purity that, from a privacy standpoint, is very troubling. I don't want data brokers to be able to track everything I read on The New York Times and combine that information with other personal characteristics about me.
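The dossier-building mechanism described above can be sketched in a few lines of Python. Every name, field, and topic here is invented purely for illustration; this does not depict any real broker's system:

```python
# Hypothetical sketch: how reading-history signals could be merged into a
# data broker's profile of a reader. All identifiers and fields are
# invented for illustration only.
profile = {
    "reader_id": "abc123",  # hypothetical pseudonymous identifier
    "known_attributes": {"age_range": "35-44", "region": "US-MD"},
    "reading_signals": [],
}

articles_read = [
    {"title": "Managing a Chronic Health Condition", "topic": "health"},
    {"title": "Faith and Community", "topic": "religion"},
]

# Each article read adds an inferred-interest signal to the dossier,
# which can later be matched against a client's screening criteria.
for article in articles_read:
    profile["reading_signals"].append(article["topic"])

print(profile["reading_signals"])  # ['health', 'religion']
```

The point of the sketch is how little it takes: once reading activity flows to a broker, sensitive inferences accumulate in the profile automatically.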
Due to Facebook's troubling privacy policies and practices, I do not utilize it for personal communications, and I have no plans to do so in the future. I urge The New York Times and others who may be thinking about hosting their content on Facebook to consider these important privacy issues before finalizing any deal that may harm their users in unanticipated ways.
Copyright 2015 by The Law Office of Bradley S. Shear, LLC All rights reserved.
Thursday, March 19, 2015
WSJ: Key FTC staff wanted to sue Google after finding ‘real harm to consumers and to innovation’
The Wall Street Journal has uncovered a never-before-released bombshell report that "concluded in 2012 that Google Inc. used anti-competitive tactics and abused its monopoly power in ways that harmed Internet users and competitors." These revelations are very troubling and raise serious questions about Google's business practices that appear to warrant further investigation.
The unreleased 160-page report concluded that Google's "conduct has resulted—and will result—in real harm to consumers and to innovation in the online search and advertising markets." This internal document was apparently released due to a FOIA request and appears not to have been intended for public consumption.
According to Yelp's vice president of public policy, Luther Lowe, "This document appears to show that the FTC had direct evidence from Google of intentional search bias." The FTC received testimony from some of the largest technology companies, and the evidence compiled appears very troubling.
The bottom line is that the tech business is extremely cutthroat, and some companies may do almost anything to obtain market share and dominance. That may include "acting evil" and intentionally harming consumers and stifling innovation for corporate profit.
Copyright 2015 by Shear Law, LLC All rights reserved.
Tuesday, January 20, 2015
Kids Digital Privacy and Cyber Security Highlighted in State Of The Union
During President Obama's State of the Union Address this evening, the importance of children's digital privacy and cyber security was highlighted. According to The White House's Medium account, the President's official prepared remarks stated:
"No foreign nation, no hacker, should be able to shut down our networks, steal our trade secrets, or invade the privacy of American families, especially our kids. We are making sure our government integrates intelligence to combat cyber threats, just as we have done to combat terrorism. And tonight, I urge this Congress to finally pass the legislation we need to better meet the evolving threat of cyber-attacks, combat identity theft, and protect our children’s information. If we don’t act, we’ll leave our nation and our economy vulnerable. If we do, we can continue to protect the technologies that have unleashed untold opportunities for people around the globe."
Since more of our personal information is being housed in cloud-based platforms, the President's comments are a welcome development. Combined with his recent historic speech at the FTC discussing the need for stronger student privacy laws, the State of the Union Address makes me optimistic that more attention will be paid to these very important issues in the near future.
Copyright 2015 by Shear Law, LLC All rights reserved.
Thursday, April 3, 2014
The Student Privacy Bill of Rights
On March 6, 2014, Khaliah Barnes, the Director of the Electronic Privacy Information Center's (EPIC) Student Privacy Project, authored an extremely important article featured in the Washington Post titled "Why a Student Privacy Bill of Rights is desperately needed". The piece details the digital privacy challenges students encounter and why they need stronger legal rights to better protect their personal privacy and safety. I wholeheartedly agree with Ms. Barnes and believe our students need more robust digital privacy protections.
The main federal laws designed to protect student privacy, the Family Educational Rights and Privacy Act (FERPA) and the Protection of Pupil Rights Amendment (PPRA), have not been updated to keep pace with the Digital Age. The lack of legal protections for our students' personal information stored in the cloud has made Ms. Barnes' Student Privacy Bill of Rights a necessity. It enumerates six basic rights for students, and I believe that in the age of Big Data, students have "certain unalienable Rights" regarding their personal privacy. The Rights are listed below:
Right #1 Access and Amendment: Students have the right to access and amend their erroneous, misleading, or otherwise inappropriate records, regardless of who collects or maintains the information.
While growing up in the 1980s, I didn't have to worry that everything I said to my classmates and/or teachers would be on my permanent record forever. When I attended elementary, middle, and high school, the primary forms of communication were in person, on the phone, and handwritten or typed letters. In college, I recall sending my first email, and then in law school email began to gain traction.
As an adjunct professor at a major international university, I have noticed that students prefer email as their primary form of communication outside of class.
Students sometimes make inappropriate remarks in class and/or email. However, students attend school to learn how to communicate, and I believe the content of their school work and their school-related communications should be protected and off limits from data mining. My students and children should be afforded the same privacy protections I experienced in school, without fear that every single student-teacher and student-student digital interaction may be used against them in the future.
Right #2 Focused Collection: Students have the right to reasonably limit student data that companies and schools collect and retain.
Schools, along with their vendors and subcontractors, should be limited in the type of data they are able to collect and retain about students. For example, some schools require student-athletes to install cyber-monitoring software onto their personal computers and personal digital media accounts so all of their online postings may be captured and archived indefinitely. One school vendor was caught by Time Magazine a couple of years ago abusing its access to personal student data and utilizing that content for advertising purposes. Therefore, it is imperative that students have the right to reasonably limit the type of personal information that is collected and retained about them by companies that contract with schools.
Right #3 Respect for Context: Students have the right to expect that companies and schools will collect, use, and disclose student information solely in ways that are compatible with the context in which students provide data.
Unfortunately, some companies have not been honest about the manner in which they collect and utilize personal student information. Education Week recently reported that Google is abusing its privilege as a school learning platform provider because it is using its Apps for Education offering to surreptitiously data mine student emails for potential advertising.
Whether it's through cloud computing, mobile communication devices, apps, or old-school personal computer networks, a tremendous amount of information is being collected by third parties, and this data is not under the direct control of our schools. Therefore, schools and their vendors must be required to disclose exactly what is happening to student information that is stored digitally.
Right #4 Security: Students have the right to secure and responsible data practices.
Secure data practices do not happen overnight and require cooperation from both schools and their vendors. Professor Dan Solove of George Washington University has advocated for years that schools hire chief privacy officers to educate and provide leadership on these issues. Earlier this year, Prof. Solove told USA Today, "[w]ithout a privacy officer in schools, there will be no one looking out for privacy issues." Recent high-profile data breaches at the University of Maryland and Indiana University demonstrate the need for educational institutions to implement policies and practices that better protect our students' privacy.
Right #5 Transparency: Students have the right to clear and accessible information about privacy and security practices.
Transparency is key to fostering successful privacy and security practices. Educational institutions and their contractors need to be required by law to be fully transparent about the type of information they collect, how it is utilized, how long it is archived, and who has access to it. School vendors such as Google that have not been transparent about their privacy and security practices put our students' privacy and personal security at risk. If schools are unable to provide clear and accessible information about their contractors' privacy and security practices, students should have the right to opt out of participating in a school-provided platform that harms their privacy and puts their personal security at risk.
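One way to picture the disclosure obligation described above is as a simple machine-readable data inventory that a vendor could publish. This sketch is purely hypothetical; the vendor name and every field are invented for illustration and do not reflect any real vendor's disclosures:

```python
# Hypothetical sketch: a machine-readable inventory a school vendor could
# publish to satisfy the transparency right. All values are invented.
import json

data_inventory = {
    "vendor": "ExampleEdTech (hypothetical)",
    "data_collected": ["student email content", "login timestamps"],
    "purpose": "delivering classroom email and storage",
    "retention_period_days": 365,
    "access": ["school administrators", "vendor support staff"],
    "shared_with_third_parties": False,
}

# Publishing the inventory as JSON would let parents, schools, and
# regulators audit exactly what is collected, why, and for how long.
print(json.dumps(data_inventory, indent=2))
```

A standardized record like this would make it straightforward to compare vendors and to spot when a platform collects more than its stated educational purpose requires.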
Right #6 Accountability: Students should have the right to hold schools and private companies handling student data accountable for adhering to the Student Privacy Bill of Rights.
FERPA provides no private right of action against school vendors. This is a huge loophole that puts the burden of protecting our children's privacy squarely on academic institutions, even though many schools are ill-equipped and underfunded to do so. New state and/or federal laws and regulations are needed to hold school contractors accountable for violating the privacy of our students.
A recently released report on Big Data and "alternative credit scoring" by the World Privacy Forum reinforces the need for greater regulation to protect our privacy. The report discusses unfairness and discrimination issues that may soon become widespread because our current legal and regulatory privacy framework was designed before email, apps, and the cloud became ubiquitous. Students shouldn't have to worry about whether their school-related research, questions, communications, and/or projects on disabilities, HIV, personal sexuality, pregnancy, sexually transmitted diseases, etc., will be data mined and/or sold to the highest bidder.
If third-party vendors mislead schools, parents, or students about their data handling or protection practices, they need to be held legally and financially responsible for privacy violations. For example, students who utilize Google Apps for Education through their schools should be able to hold Google legally and financially accountable for data mining their school digital interactions, content, work, etc., for non-educational purposes.
Soon after the Education Week article that uncovered Google's very troubling student data mining practices was published, I reached out to Ms. Barnes and asked her to comment on these new revelations. In an email, Ms. Barnes stated, "Google's data mining admissions underscore the importance of the Student Privacy Bill of Rights. Here's a situation where students lost total control over their information. The students first lost control when the schools made a choice on behalf of students, without first adequately vetting Google's data practices and ensuring that those practices don't put students at risk. Second, students lost control when Google decided to read students' emails. Google's practices contravene the Student Privacy Bill of Rights by repurposing student data for commercial use. Google should be held accountable to students, the Education Department, and the Federal Trade Commission for violating student trust."
As a society, we need to do more to protect our children's privacy in the Digital Age. A first step would be to adopt the principles advocated by Ms. Barnes in her Student Privacy Bill of Rights.
Copyright 2014 by the Law Office of Bradley S. Shear, LLC. All rights reserved.
Saturday, September 28, 2013
Will the European Union Ban Data Mining of Student School Content?
To lower costs and increase efficiency, a growing number of educational institutions are transitioning from internal servers to external cloud-based services. Well-known technology companies such as Amazon, Google, HP, IBM, Microsoft, and Oracle are competing to become the go-to cloud service provider for schools.
Milton Friedman, the famous economist, popularized the phrase "there ain't no such thing as a free lunch." In other words, one always has to pay for a good or service, whether by exchanging money or giving up something of value. During the past decade, a growing number of digital companies have adopted a model where they offer their services for free in the hope that their platform gains widespread acceptance. In return, users pay for the service by giving up their personal privacy, accepting agreements that enable service providers to monetize their personal information.
Education budgets in some European member states have been slashed during the past several years due to the economic downturn. Some cloud computing providers appear to be capitalizing on these deep budget cuts as part of their pitch to governments and educational institutions. Unfortunately, some digital service providers do not have the best intentions, because strong privacy protections are not built into the design of some of their platforms.
These companies may require schools to execute agreements that do not properly protect the personal data of students. For example, Sweden's data protection authority recently ordered a school district to stop utilizing Google Apps for Education because the service contract didn't comply with Sweden's Data Protection Act. In other words, Google's agreement with a municipality in Stockholm did not provide the proper safeguards to protect student data.
The model UK Google Apps For Education Agreement states, "Customer agrees that Google may serve advertisements ("Ads") in connection with the Service to End Users who are not designated by Customer as enrolled students." Does this clause mean that teachers, administrators, and alumni are served ads? Since students are most likely utilizing school-provided email to communicate with their teachers, and teachers may discuss student matters with administrators via email, are teacher-student, administrator-student, and teacher-administrator emails data mined and monetized by Google?
Another troubling agreement clause states, "Customer agrees that any revenue generated by Google from the Ads or otherwise derived by Google from the Services will be retained by Google and will not be subject to any revenue sharing." Does this indicate that, in addition to serving ads based upon teacher-student, administrator-student, and teacher-administrator digital interactions, the information contained in these emails may be monetized in other forms not necessarily mentioned in the agreement?
SafeGov.org recently released a report about cloud computing and student privacy. The organization conducted "in-depth interviews with over a dozen representatives of European Data Protection Authorities (DPAs) as well as a number of European Commission officials involved in the development of data protection policy." The report found "wide support for the idea that vulnerable data subjects such as school children deserve special protection."
SafeGov.org's findings stated that some cloud providers may be offering schools services that were initially built for the consumer behavioral advertising market, and that these services do not appear to have privacy by design built into their architecture. According to SafeGov.org, "advertising-oriented cloud services may jeopardize the privacy of data subjects in schools, even when ad-serving is nominally disabled."
Some major threats to student privacy noted in SafeGov.org's report include:
Lack of privacy policies suitable for schools: "[C]loud providers may deliberately or inadvertently force schools to accept policies or terms of services that authorize user profiling and online behavioral advertising."
Potential for commercial data mining: "When school cloud services derive from ad-supported consumer services that rely on powerful user profiling and tracking algorithms, it may be technically difficult for the cloud provider to turn off these functions even when ads are not being served."
User interfaces that don't separate ad-free and ad-based services: "By failing to create interfaces that distinguish clearly between ad-based and ad-free services, cloud providers may lure school children into moving unwittingly from ad-free services intended for school use (such as email or online collaboration) to consumer ad-driven services that engage in highly intrusive processing of personal information (such as online video, social networking or even basic search)."
Contracts that don't guarantee ad-free services: "By using ambiguously worded contracts and including the option to serve ads in their services, some cloud providers leave the door open to future imposition of online advertising as a condition for allowing schools to continue receiving cloud services for free."
SafeGov.org's findings are very troubling and demonstrate the need for regulators and lawmakers in the EU to be proactive in protecting the personal privacy of our next generation of leaders. While this report was based upon research performed in the EU, it would not surprise me if regulators and lawmakers around the world have similar thoughts regarding the need to protect vulnerable groups such as students and children from behavioral advertising. Shouldn't all students and children, regardless of their geographic location, be afforded the same privacy protections?
Copyright 2013 by the Law Office of Bradley S. Shear, LLC. All rights reserved.
Tuesday, July 17, 2012
Is trademark and copyright law worthless in the social media age?
The Digital Millennium Copyright Act (DMCA) was signed into law in 1998 to protect the intellectual property rights of content creators while also providing a safe harbor for internet service providers and websites that act in good faith to remove infringing content once they become aware of it. The Lanham Act was enacted in 1946 and has been amended several times. In 1999, the Anticybersquatting Consumer Protection Act amended the Lanham Act to address domain name trademark issues.
Does the DMCA or the Lanham Act still work in the Social Media Age? On June 16, 2010, I blogged that intellectual property protection is useless in the social media age. Since that post, very little has been done to better protect content creators from the illegal use of their intellectual property without their permission or compensation. Congress has not been able to draft compromise intellectual property legislation that better protects digital intellectual property rights while also creating a fair and equitable system to protect innocent internet service providers and websites from liability.
According to a recent San Francisco Chronicle article, Facebook appears to be a haven for the sale of counterfeit goods. Ironically, the article mentions that Facebook has been notified about this issue, but it appears the company will not do anything about it unless the trademark holder personally contacts it. Does this response demonstrate that Facebook has a huge problem with ads for counterfeit goods on its platform? Under the latest appellate court ruling in Viacom v. YouTube, will Facebook soon have significant legal liability issues to address?
The bottom line is that it takes time for the law to catch up with technology.
Copyright 2012 by the Law Office of Bradley S. Shear, LLC. All rights reserved.
To learn more about these issues you may contact me at http://shearlaw.com/attorney_profile.