To inform about the legal, business, privacy, cyber security, and public policy issues that confront those who utilize digital platforms.
Monday, February 25, 2013
Massachusetts Bill To Ban Data-Mining of Student Emails
Massachusetts has become the first state to introduce legislation that would ban companies that provide cloud computing services from processing student data for commercial purposes. MA Bill 331 is sponsored by Rep. Carlo Basile and was referred to the House Committee on Education on January 22, 2013.
MA Bill 331 states, "Section 1. Notwithstanding any general or special law to the contrary any person who provides a cloud computing service to an educational institution operating within the State shall process data of a student enrolled in kindergarten through twelfth grade for the sole purpose of providing the cloud computing service to the educational institution and shall not process such data for any commercial purpose, including but not limited to advertising purposes that benefit the cloud computing service provider."
The bill may be interpreted to mean that firms that offer cloud computing services to Massachusetts academic institutions enrolling kindergarten through twelfth graders may not utilize the information contained in student emails for monetary gain. If this legislation is enacted, cloud service providers may not serve ads to students on school-provided digital accounts based upon a student's digitally expressed thoughts or ideas.
Internet advertisers monetize the thoughts and ideas of their users via behavioral advertising. Digital behavioral advertising may occur when an email service provider scans the content of an email and then serves the user ads based upon the information it processes. For example, if a student emailed his health or sex education teacher to ask about sexually transmitted diseases or teen pregnancy, MA Bill 331 would ban a cloud computing service provider from serving ads for condoms or other related products or services to the student's school-owned digital account.
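The scan-and-match mechanism described above can be sketched as a toy program. This is purely illustrative, not any provider's actual system; the ad categories and trigger keywords are invented for the example.

```python
# Toy sketch of keyword-based behavioral ad matching (invented inventory,
# not any real provider's system): scan an email body and return the ad
# categories whose trigger keywords appear in the text.
AD_INVENTORY = {
    "condoms": ["sexually transmitted", "std", "pregnancy"],
    "test prep": ["sat score", "college application"],
}

def match_ads(email_body: str) -> list[str]:
    """Return ad categories whose trigger keywords appear in the email."""
    text = email_body.lower()
    return [ad for ad, keywords in AD_INVENTORY.items()
            if any(kw in text for kw in keywords)]

email = "Dear teacher, I have a question about teen pregnancy and STDs."
print(match_ads(email))  # ['condoms']
```

Under MA Bill 331, running even this simple kind of matching against a K-12 student's school account for the provider's commercial benefit would be prohibited.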
According to a statement from the American Academy of Pediatrics, "young people are cognitively and psychologically defenseless against advertising." Therefore, would it be acceptable if a teacher were paid to review student class work, noted student preferences, and then returned graded assignments with offers for discounted merchandise based upon a student's homework or in-class assignments?
Since it would be a breach of the National Education Association's Code of Ethics for a teacher to utilize personal knowledge obtained from his students for private advantage, shouldn't it also be a breach if a cloud computing service provider utilizes an algorithm to do the same digitally? If it is not acceptable for teachers to offer discounts based upon student preferences gleaned from school work, it should not be acceptable for a computer algorithm to process the same information digitally and then serve ads based upon the same data.
While MA Bill 331 is a good start, it should be amended to cover post-secondary students because Massachusetts is home to tens of thousands of college students and some of the most prestigious academic institutions in the world. Shouldn't students in college and graduate school also have their student-teacher interactions protected from being utilized for commercial purposes?
In general, Google's Apps for Education standard agreement provides schools the ability to serve ads to their students. The agreements generally state that all advertising revenue generated will be retained by Google, so at this point it appears that schools do not have an economic incentive to turn on the behavioral advertising function. However, what will stop Google from approaching schools and stating that, in order to continue receiving Google Apps for Education for free, the advertising function must be enabled? Should graded school assignments and personal student-teacher interactions be utilized to serve ads to students in order to pay for educational software?
Educational software is expensive, and because of the severe recession our country has experienced, many states have seen steep cuts in education funding. While Massachusetts public schools have not yet experienced the same type of funding cuts that have beleaguered many other states, what will happen when Massachusetts decides it must recalibrate how it dedicates its resources and K-12 schools are negatively affected?
Tens of thousands of kindergarten through twelfth grade students in Massachusetts may already be at risk of having their school work data mined for advertising purposes. For example, students who attend Burlington Public Schools and Plymouth Public Schools in Massachusetts utilize Google Apps for Education. If students at these schools use their school-provided Gmail-based accounts after they graduate, or link their personal YouTube or Google Plus account to their school-sanctioned Gmail account, their student-teacher interactions and class work may be monetized by Google and/or its advertising partners. However, if MA Bill 331 is enacted, it may stop third parties from being able to monetize the digital thoughts and ideas of Massachusetts students and better protect their privacy and security.
Ninety-six percent of Google's $37.9 billion in 2011 revenue was earned from advertising. Is Google providing schools free access to its Google Apps for Education software in the hopes that it will eventually earn advertising revenue from data mining our children's digital school assignments and education-related interactions? Absent state and/or federal laws that ban the data mining of our children's class work on school-provided digital accounts, companies that offer educational cloud computing services to our schools may utilize our kids' personal private data for commercial gain.
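For scale, the arithmetic behind the figures cited above is straightforward:

```python
# 96 percent of Google's reported $37.9 billion in 2011 revenue,
# using only the figures quoted in the text above.
total_revenue = 37.9  # billions of dollars
ad_share = 0.96
ad_revenue = total_revenue * ad_share
print(f"${ad_revenue:.1f} billion from advertising")  # $36.4 billion from advertising
```

In other words, roughly $36.4 billion of Google's 2011 revenue came from advertising.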
To learn more about these issues you may contact me at http://shearlaw.com/attorney_profile.
Copyright 2013 by the Law Office of Bradley S. Shear, LLC. All rights reserved.
Monday, February 18, 2013
Right To Privacy Will Be Protected By The Social Networking Online Protection Act
The Social Networking Online Protection Act (SNOPA) was recently reintroduced by Congressman Eliot Engel of New York. SNOPA is the first bipartisan federal legislation designed to protect the digital privacy of employees, job applicants, students, and student applicants in the Social Media Age. The legislation may also provide businesses and academic institutions with a legal liability shield, making it difficult for litigants to claim that these entities have a legal duty to monitor the personal digital accounts of their employees and/or students.
The right to digital privacy needs to be statutorily strengthened in the United States. Last year, the Supreme Court in U.S. v. Jones ruled that the government needs a warrant in order to place a GPS device onto a suspect's car. The Jones decision demonstrates that the judiciary recognizes that people still have an expectation of privacy in the Social Media Age.
At this point, there have been only a handful of publicized examples where employees have alleged that their employer and/or a company with which they interviewed requested access to their personal digital accounts. This may be an underreported problem because, according to a 2012 Harris Interactive survey, 37% of hiring managers utilize social networking sites to screen candidates.
Without the protections that SNOPA provides, how long will it be before it becomes commonplace for employers to require that job applicants and/or employees provide access to personal password-protected digital accounts as part of the employment process? In 2008, Congress enacted the Genetic Information Non-Discrimination Act (GINA) to bar employers from using genetic information when making employment decisions. GINA was not enacted because of a high-profile incident where an employer required a candidate to submit his genetic information as part of the application process; it was enacted as a pre-emptive measure. In contrast, there are already multiple verifiable situations where employers are requiring job applicants to provide their personal digital credentials as part of the application process.
While there have only been a handful of publicized incidents where employers required access to candidates' personal password-protected digital content, thousands of students across the country are being required to turn over their digital usernames and/or passwords, Facebook Friend a school administrator, or install cyberstalking software in order to attend a public school, keep a scholarship, or participate in extra-curricular activities.
There have been multiple incidents where public school students have been forced, without reasonable suspicion, to turn over their personal Facebook and/or email usernames and passwords to school administrators. Universities across the country are requiring student-athletes to register their social media usernames, Facebook Friend school officials, and/or install cyberstalking software to track and archive their personal digital activity.
With access comes responsibility. Last year, a former Library of Congress employee alleged in a lawsuit that he was discriminated against because his former supervisor viewed one of the groups he liked on Facebook. The family of Yeardley Love, a University of Virginia (UVA) student-athlete who was murdered on UVA's campus by her former boyfriend George Huguely (also a UVA student-athlete), is suing UVA and school employees for $30 million for failing to properly protect their daughter.
Love's family alleges that UVA and its employees knew or should have known Huguely was a danger to Love because, as a star student-athlete, he was not properly disciplined for past known inappropriate conduct. While it is too soon to speculate about what type of evidence Love's family will introduce during legal proceedings, if UVA and/or its employees had access to Huguely's or Love's personal digital accounts and missed or intentionally ignored content that may have indicated a potential problem, this may create tremendous legal liability for UVA and/or its employees.
If SNOPA is enacted students will not have to worry about being required to provide access to their personal digital accounts in order to attend the school of their dreams or keep their scholarships. In addition, academic institutions that do not violate the law may have a strong legal liability shield against litigants who claim schools have a legal duty to become the social media police.
Protecting personal digital privacy will help grow the economy and foster new technological breakthroughs. If people believe their personal password-protected digital thoughts, ideas, and creations are statutorily protected, they will increase their usage of Dropbox, Microsoft SkyDrive, Google Plus, Facebook, and similar platforms. It is vital for our country's competitive future to implement public policy that encourages increased digital platform participation in our increasingly interconnected world.
SNOPA would encourage widespread consumer adoption of cloud-based platforms because users would not have to worry that their employer or school may require them to provide access to their personal password-protected digital accounts absent a judicial order. SNOPA is bipartisan win-win legislation that protects employers, employees, job applicants, schools, students, and student applicants.
To learn more about these issues you may contact me at http://shearlaw.com/attorney_profile.
Copyright 2013 by the Law Office of Bradley S. Shear, LLC. All rights reserved.
(Full Disclosure: I am working with Congressman Engel's office on this bill.)
Wednesday, February 6, 2013
U.S. Social Networking Online Protection Act Reintroduced
The Social Networking Online Protection Act (SNOPA) was reintroduced today by Congressman Eliot Engel of New York. The bill would ban employers and schools from requesting or requiring that employees, job applicants, students, or student applicants provide access to personal password-protected digital accounts.
The bill is a win for businesses, schools, employees, job applicants, student applicants, students, and the right to privacy.
With access comes responsibility. Without access it would be very difficult for an employer or school to be held legally liable for the digital content that an employee or student posts on their personal digital accounts. Therefore, the bill may protect businesses, schools, and taxpayers from tremendous legal liability.
This bill is needed because some companies are approaching employers and schools with the pitch: require your employees and/or students to verify their digital media credentials so we can scan everything they have said online, everything said about them online, and everything their digital connections discuss online. In general, nobody should be required to verify their personal digital credentials/activities/content absent a legal proceeding that requires it. More information will be forthcoming.
To learn more about these issues you may contact me at http://shearlaw.com/attorney_profile.
Copyright 2013 by the Law Office of Bradley S. Shear, LLC. All rights reserved.
(Full Disclosure: I am working with Congressman Engel's office on this bill.)
Friday, February 1, 2013
FTC: More Mobile Apps Privacy Disclosures Required
The FTC recently released its "Mobile Privacy Disclosures: Building Trust Through Transparency" staff report.
The theme of the report is that mobile platform operating system providers (Amazon, Apple, BlackBerry, Google, and Microsoft), app developers, ad networks, and analytics companies need to provide consumers with timely, easy-to-understand disclosures about the data that is collected about them and how that data is utilized.
It appears to build on the September 2012 report “Marketing Your Mobile App: Get it Right From the Start”. Some of the recommendations in the September 2012 report include: build privacy considerations in from the start, honor your privacy promises, collect sensitive information only with consent, and keep user data secure.
Some members of the app ecosystem appear to have taken the FTC's September 2012 report very seriously and anticipated that the FTC would soon crack down on companies that may not be following its prior digital privacy recommendations. Before the FTC's new mobile privacy staff report was released, Apple, Facebook, and Microsoft teamed up to create a new initiative to educate app developers about digital privacy. The program, called ACT 4 Apps, aims to create an environment where app developers can interact with privacy experts to learn how to abide by state and federal privacy laws.
The announcement that the FTC fined the social networking app Path $800,000 for alleged privacy violations, along with this new staff report, continues to demonstrate that the FTC is spending considerable resources on digital privacy issues. When the FTC announced last August that Google agreed to pay a $22.5 million fine for misrepresenting to users of Apple's Safari Internet browser that it would not place tracking "cookies" or serve targeted ads to those users, that should have been a wake-up call to the digital industry that its business practices may be more heavily scrutinized. December's announcement that the FTC adopted final amendments to the Children's Online Privacy Protection Rule (COPPA) to strengthen kids' privacy protections should have been recognized as a signal by the digital industry that it must become more proactive in protecting the personal data of its users.
This newly released staff report recommends that mobile platforms should: provide just-in-time disclosures to consumers and obtain their affirmative express consent before allowing apps to access sensitive content like geolocation; consider providing just-in-time disclosures and obtain affirmative express consent for other content that consumers would find sensitive in many contexts; consider developing a one-stop “dashboard” approach to allow consumers to review the types of content accessed by the apps they have downloaded; consider developing icons to depict the transmission of user data; promote app developer best practices; consider providing consumers with clear disclosures about the extent to which platforms review apps prior to making them available for download in the app stores, and conduct compliance checks after the apps have been placed in the app stores; and consider offering a Do Not Track (DNT) mechanism for mobile phone users.
App developers should: have a privacy policy and make sure it is easily accessible; provide just-in-time disclosures and obtain affirmative express consent before collecting and sharing sensitive information; improve coordination and communication with ad networks and other third parties that provide services for apps so the app developers can better understand the software they are using and, in turn, provide accurate disclosures to consumers; and consider participating in self-regulatory programs, trade associations, and industry organizations.
This staff report states that advertising networks and other third parties should: communicate with app developers so that the developers can provide truthful disclosures to consumers; and work with platforms to ensure effective implementation of DNT for mobile platforms.
The overall theme of this staff report is that the mobile apps industry must do a better job of communicating to its users what data is being collected and how it is being utilized. The bottom line is that the FTC will likely monitor closely how stakeholders react to its recommendations, and if stakeholders do not implement them in a timely manner, more regulation may follow to protect the digital privacy of consumers.
While mobile apps offer some great benefits and exciting new ways to interact with others, there are tremendous privacy issues that need to be addressed. Mobile ecosystem gatekeepers and app developers need to work with regulators and lawmakers to protect the personal privacy of mobile app users and to ensure that the industry does not become over-regulated.
To learn more about these issues you may contact me at http://shearlaw.com.
Copyright 2013 by the Law Office of Bradley S. Shear, LLC. All rights reserved.