Consumer groups applaud congressional action to improve live event ticketing marketplace

September 20, 2019

Media contact: National Consumers League – Carol McKay, carolm@nclnet.org, (412) 945-3242 or Taun Sterling, tauns@nclnet.org, (202) 207-2832

Washington, DC—Today, the National Consumers League (NCL), along with seven other leading consumer and public interest groups, sent a letter to Congressman Bill Pascrell (D-NJ), Chairman Frank Pallone (D-NJ), and Senator Richard Blumenthal (D-CT) to applaud the lawmakers’ leadership in fixing the opaque live event industry by reintroducing the Better Oversight of Secondary Sales and Accountability in Concert Ticketing Act of 2019 (BOSS ACT).

The following statement is attributable to Brian Young, public policy manager at the National Consumers League: 

“Unchecked consolidation in the live event industry has led to an opaque ticket marketplace that is rigged against consumers. In addition to undisclosed holdbacks designed to create a false sense of ticket scarcity, consumers are forced to grapple with a litany of fake websites that pose as legitimate box offices, and with ridiculous fees that increase the cost of a ticket by an average of 27-31 percent. These outrageous fees typically prevent comparison shopping because they are often not disclosed until near the end of the purchase process. Likewise, despite the passage of legislation in 2016 banning the use of ticket-buying bots, consumers have witnessed an increase of nearly 17 percent in illegal ticket-buying bot usage. Fortunately, Congressman Bill Pascrell, Congressman Frank Pallone, and Senator Richard Blumenthal are working to bring transparency and competition back to the live event ticket marketplace. Today’s letter from eight leading consumer advocacy groups applauds their efforts.”

To add transparency to the live event ticketing marketplace and empower consumers to make informed purchasing decisions, the BOSS ACT would: 

  • Prevent primary and secondary ticket marketplaces from slamming consumers with hidden fees during the checkout process; 
  • Prohibit scalpers from impersonating venues’ and teams’ websites to charge higher prices for less-desirable seats; 
  • Require primary ticket sellers to be honest about the number of tickets they plan on selling; and
  • Require the Federal Trade Commission (FTC) to identify ways to improve enforcement against illegal ticket-buying bots. 

To read the full letter, and learn more about the BOSS ACT, click here. 

###

About the National Consumers League

The National Consumers League, founded in 1899, is America’s pioneer consumer organization. Our mission is to protect and promote social and economic justice for consumers and workers in the United States and abroad. For more information, visit www.nclnet.org.

Developing an approach towards consumer privacy and data security

By NCL Google Public Policy Fellow Pollyanna Sanderson

This blog post is the first of a series of blogs offering a consumer perspective on developing an approach towards consumer privacy and data security.

For more than 20 years, Congressional inaction on privacy and data security has coincided with a rise in data breaches impacting millions of consumers. In the absence of Congressional action, states and the executive branch have increasingly stepped in. A key part of the White House’s response is the National Telecommunications and Information Administration’s (NTIA) September Request for Comment (RFC).

While a “Request for Comment” sounds incredibly wonky, it is a key part of the process that informs the government’s approach to consumer privacy. The NTIA’s process gathers input from interested stakeholders on ways to advance consumer privacy while protecting prosperity and innovation. Stakeholder responses provide a glimpse into where consensus and disagreements lie among consumer and industry players on key issues. We have read through the comments and in this series of blogs are pleased to offer a consumer perspective.

This first blog focuses on a fundamental aspect of any proposed approach to privacy and data security: the scope. Reflecting the risks of big data classification and predictive analytics, one suggestion by the Center for Digital Democracy (CDD) was to frame the issues according to data processing outputs. This would cover inferences, decisions, and other data uses that undermine individual control and privacy. Focusing instead on data inputs, there was consensus among many stakeholders that privacy legislation must cover “personal information.”

The Center for Democracy and Technology noted that personal information is an evolving concept, the scope of which is “unsettled…as a matter of law, policy, and technology.” Various legal definitions exist at the state, federal, and international levels. The Federal Trade Commission (FTC), in its 2012 privacy report, defined it as information capable of being associated with, or reasonably linked or linkable to, a consumer, household, or device. Subject to certain conditions, de-identified information is excluded from this definition. To help address privacy concerns while enabling data collection and use, many stakeholders agree that regulatory relief should be provided for effective de-identification techniques. This would incentivize the development and implementation of privacy-enhancing techniques and de-identification technologies such as differential privacy and encryption. Federal law should avoid classifying covered data in a binary way as personal or non-personal. An all-or-nothing approach requiring irreversible de-identification sets a difficult or impossible standard.

In an attempt to recognize that identifiability rests on a spectrum, the EU’s General Data Protection Regulation (GDPR) excludes anonymized information and introduces the concept of pseudonymized data. These concepts demand federal consideration, having been introduced into United States law via the California Consumer Privacy Act (CCPA). The law should clarify how it applies to aggregated, de-identified, pseudonymous, identifiable, and identified information. To be considered de-identified data subject to lower standards, data must not be linkable to an individual, the risk of re-identification must be minimal, the entity must publicly commit not to attempt to re-identify the data, and effective legal, administrative, technical, and/or contractual controls must be applied to safeguard that commitment.

While de-identified and other anonymized data may be subject to lower privacy standards, they should not be removed from protection altogether. In its NTIA comment, CDD highlights that third-party personal data, anonymized data, and other forms of non-personal data may be used to make sensitive inferences and to develop profiles. These could be used for purposes ranging from persuading voters to targeting advertisements. However, individual privacy rights may only be exercised after inferences or profiles have been applied at the individual level. Because profiles and inferences can be made without identifiability, this aspect of corporate data practice would largely escape accountability if de-identified and other anonymized data were not subject to standards of some kind.

This loophole must be closed. Personal information should be broadly defined to address risks of re-identification and to capture evolving business practices that undermine privacy. While the GDPR does not include inferred information in its definition of personal information, inspiration could be taken from the definition of personal information given by the CCPA, which includes inferred information drawn from personal information and used to create consumer profiles.

Our next blog will explore developing an approach for handling privacy risks and harms. In its request for comment, the NTIA established a risk- and outcome-based approach to consumer privacy as a high-level goal for federal action. However, within industry and society, there is a lack of consensus about what constitutes a privacy risk. Stay tuned for a deep dive into the key issues that arise.

The author completed her undergraduate degree in law at Queen Mary University of London and her Master of Laws at William & Mary. She has focused her career on privacy and data security.