2025-04-09 00:00:00 - Joint Committee on Advanced Information Technology, the Internet and Cybersecurity
REP FARLEY-BOUVIER - Good afternoon. It's a popular room today. Very happy to have you all here today. This is a hearing of the Joint Committee on Advanced Information Technology, the Internet, and Cybersecurity. There are 2 other hearings happening at the same time, so if this is not the right hearing, I invite you to find the one you're meant to be at. We're delighted to have our first hearing of the committee. I'm always happy to be working with my Senate co-chair, Senator Michael Moore, and to welcome the people who are in this room. And we do have a number of people who are participating virtually. We are proud in the Massachusetts legislature that we have made the ability to testify virtually a permanent rule of our committees. I think that is really good government, and I'm gonna just say personally, it is very good for my constituents in the Berkshires. I'm gonna start by welcoming my House colleagues, and then the Senator will welcome his Senate colleagues. We have here Vice Chair Jim Hawkins. Thanks for being here. Tom Coakley is here. Orlando Ramos, not quite here yet. Tommy Vitolo is here. Steve Owens, Steve Bolett, Joan Meschino, Mark Lombardi, and Sean Ghesqui are our members.
I'd also like to take a moment to thank my staff, researcher Claire Roman and counsel David Herbstich. My chief of staff is here helping with sign-ups, and my legislative aide and district aide are helping us with the online work. Putting a hearing like this together is a great deal of work, and we couldn't possibly do it without the staff. And for all of you who are testifying, if you need more information or wanna give us further information or feedback on this bill, please reach out to the staff. You may know that I do have some strong opinions about this topic. In today's increasingly connected world, our personal information has become a valuable commodity. In other words, people are buying and selling our personal data. And we are constantly generating data through our online activities, whether that's purchases, health decisions, or even our physical movements. This data tells a story about us: our beliefs, our health conditions, our financial status, and our most private decisions.
All of that, all the time. Particularly sensitive information, like our health data, our precise location, our financial details, and our children's information, deserves heightened protection. No company should be able to buy and sell that data. I believe that it's been way too long that the burden to protect our data has fallen on the individual, as opposed to placing that burden on the industry that's collecting the data. So it's not just about individual protection. As for the process today: as is customary in the legislature, we take our colleagues who are the lead sponsors first. You're gonna be seeing that. We will then be going back and forth between people who have signed up to be here in person and people who have signed up virtually. You're gonna see that throughout the hearing. And I'm going to ask for a little extra patience from all of you. You're gonna see a lot of activity here. The House is in session today, and so House members, whether they are here to testify or they are on the committee, are going to need to be in and out of the hearing. And I also wanna say a special welcome: there are some first-year members in attendance here today, and this is their first hearing for some of them. So that's pretty cool. So, members, if you do have to go up to vote, you'll be notified here in the hearing room. Thanks to the court officer who takes such good care of us. But do come back. Do come back. There's a lot of really important testimony. So, I am going to start by calling our first legislators up.
SPEAKER4 - Oh, that's right.
SEN MOORE - I enjoy working with you so much. Thank you, representative. I first wanna thank everyone for being here today. And as the representative said, you know, we've got a lot going on, and we've got 2 other hearings going on. So we have some Senate colleagues who are on this committee who are not with us. But I do wanna just read out who is on the committee with us. As the vice chair from the Senate, we have Senator Payano, we have Senator Finegold, we have Senator Barrett, Senator Mark, and Senator Durant. Yeah. As we start the hearing, I just wanna follow my co-chair's lead and let you know where I'm coming from as we look at the legislation today and going forward. Almost weekly, we read about data breaches that compromise the data of our constituents. These breaches are made worse by the fact that companies gather larger and larger amounts of data on us. We are told that self-regulation works, only to read the next week that a company suffered a massive data breach. We get told by companies not to limit the collection and sale of location data, only to read that the FTC found that a location data broker unfairly collected sensitive data, and then read that the same broker got hacked by a Russian cyber criminal, with huge amounts of personal data leaked.
We have also seen breaches in health care, federal, state, and municipal government, schools, and finance. These are just some examples of a reality that we all know. In my opinion, self-regulation does not work, but it has to be a balance. You know, we have to work with our stakeholders to try to find that balance, one that's not going to stifle innovation or the economy, but will provide the protections that our constituents in Massachusetts actually need. All of this is made worse by the actions of the federal government, which has deployed the data of Americans to further what some of us feel is an unlawful and unconstitutional agenda. For example, we have just seen a deal reached between the IRS and ICE for immigration enforcement that likely violates federal privacy law. I'm also very disturbed by what has happened with San Francisco Pride as companies try to appease the president. A tech CEO has gone on a certain podcast to state that companies need to bring masculine aggression back and stated that he believes the workplace has become, quote, neutered.
We all know what he meant. It is no surprise to hear President Trump declare that every tech CEO wants to be his friend. We have also seen data that can be weaponized, for example, to undermine women's health and their access to reproductive health care. Congress should act to protect our residents, but we know they will not. I cannot support any legislation that leaves our privacy to companies' policies when these companies abandon their principles and policies to appease the current president and Elon Musk. The legislature must act to establish real data protections that address these issues and protect our constituents, because it is clear that the status quo cannot and will not work. Before turning back to my House chair, I also wanna thank Vice Chairman Payano. We do have the Location Shield Act before us today, which was originally sent to another committee, Consumer Protection and Professional Licensure. Please work with us and try to keep to the time limits that we establish. But other than that, let's hope we have a successful session and, you know, we'll be here to vote. So thank you.
FARLEY-BOUVIER - Thank you, Senator. I'm going to call up our first panelists, my colleagues David Vieira and Myra. As they're getting settled, the first announcement is to say that we are now taking today's testimony.
UNIDENTIFIED SPEAKER 1 - H 86 - Alright. Thank you so much, Chairs Farley-Bouvier and Moore and members of the committee, for the opportunity to come before you today. I'm here to speak in strong support of H 86, an act to protect location privacy, with my co-filer, Representative David Vieira, and I'm really thrilled to have this partnership this session. Like you, I'm frequently mindful that our federal and state laws have not kept pace with the rapid development of digital technology. So I'm very proud to have refiled the Location Shield Act this session to update our statutory framework to ensure that we are reflecting the digital nature of our lives. I was thrilled to see this bill receive much legislative and external support last session. The House unanimously voted in favor of a narrower version of it, and we have seen that support continue to grow and become increasingly bipartisan this session as well, supported by our legislative colleagues with more than 90 cosponsorships. Given the current efforts of the federal administration and other states, this legislation is critical for ensuring that we continue to protect the autonomy and privacy of those in the commonwealth.
Every day, we're making decisions, engaging in commerce and leisure, and communicating across digital platforms. If each of us is to continue to be the protagonist of our one life, we must ensure that our laws assert and protect the right to individual privacy so that we can continue to write our narratives with autonomy. In the 21st century, that means our laws must establish guardrails and protections on the digital data we create so that corporate interests do not collect and monetize it in ways that impede our agency. The Location Shield Act takes a critical step in addressing this paradigm. If enacted, Massachusetts would lead the nation in ensuring that individuals' location data cannot be collected and sold. You can still use your cell phone to route your next car trip. You can still check the weather forecast. But vendors and third-party interests cannot sell your personal device location information to the highest bidder. This legislature, in a bipartisan response, sprang into action in the wake of the Dobbs decision almost 3 years ago now to reaffirm the right to abortion in Massachusetts and to shield our medical providers and their patients from regressive lawsuits originating in other states, including those that have enacted vigilante justice regimes that encourage persons to track and bring to prosecution women and their providers.
The Location Shield Act is critical for ensuring that our reproductive justice law actually has meaning in a world where there currently is no prohibition on using a woman's location data to determine the medical care she is receiving. Just last year, a federal investigation revealed that a data brokerage company tracked people's interstate visits to Planned Parenthood locations across 48 states, including right here in Massachusetts, using cell phone location data, and then provided that data to 1 of the largest anti-abortion campaigns in the nation. The Location Shield Act further holds importance for members of groups that are disproportionately vulnerable to danger in living their authentic lives: those seeking gender-affirming care, religious minorities attending services in mosques and synagogues, our judges and public safety officials, and domestic violence survivors, to name several. You will hear from many data privacy experts and advocates today, and I am very grateful for their support and collaboration on this bill. I would be pleased to work with this committee as it reviews the Location Shield Act and request that you provide it with a favorable report. Thank you.
REP VIEIRA - Thank you, Chair Farley-Bouvier and Chair Moore. We just heard from my colleague here about some specific instances where this Location Shield Act is important. But as I said on the day that we had the unanimous vote in the House last session, privacy is a fundamental human right, and that human right underpins our freedom of association, thought, and expression, as well as freedom from discrimination. The Location Shield Act that we're talking about now is a necessary safeguard of our right to privacy, and it's a fundamental value that we proved in the House in a limited situation around abortion and access to transgender health. So you can imagine that that issue itself divides some of our colleagues. But because this was a first step in that limited situation, Republicans and Democrats came together without regard for political ideology. I'm hopeful that this committee will not only report this out but also report the language of the Location Shield protections along with comprehensive privacy, which we all, again, have a fundamental human right to. A couple of years ago, many of us went throughout our districts, and we spoke about the anniversary of the United Nations human rights declaration. In 1948, the former first lady of the United States, Eleanor Roosevelt, who chaired the committee, ingrained in Article 12 the human right to privacy. And so, as I told my colleagues on the floor last session, this is the first step. Let's all be together in this. But let's come back in the next session and work with our Senate colleagues on more comprehensive location protection. And so, together, we can all add these protections to state law and honor our commitment to human rights.
MOORE - I just want to announce that Senator Payano, the vice chair of the committee, has just arrived.
FARLEY-BOUVIER - Next we are bringing up Senator Creem, who is coming up now. I did neglect to say before that we are asking for a 3-minute time limit. There is a timer on the screen to help you out with that. So we just wanted to let you know, and my staff will help with that. Written testimony is very welcome.
SEN CREEM - SB 197 - Thank you. And I hope to abide by the 3-minute rule. I wanna thank you, Chair Moore and Chair Farley-Bouvier, for taking me out of turn, and really for your ongoing support, both of you, of advancing meaningful data privacy reforms in Massachusetts. In the last session, I know that I worked with Senator Moore, who was with me all the way on this. So, thank you both for all your input. I'm speaking today in support of my bill, Senate 197, an act to protect safety and privacy by stopping the sale of location data, and Senate 29, an act to establish the Massachusetts Data Privacy Act. Senate Bill 197 would ban the sale of location data in Massachusetts. Our location data is constantly being collected by apps on our phones and used to track our movement, our identity, and where we spend time. Many people don't even realize that their apps are doing that. And this is a critical moment, and believe me, it's a very critical moment, as we actively explore how Massachusetts can act at the state level to protect our residents in this current political climate.
Shielding our location data needs to rise to the top of that list. It needs to be number 1. Our location data reveals some of the most sensitive details about us. This personal information can provide insight into our health care providers, religious affiliation, and even our sexual orientation. And I wanna say that I agree with the previous speaker who said, let this be a comprehensive bill. In a time of rising antisemitism, I don't want my data to be used either. So it's very broad here. Alarmingly, under current state and federal law, privacy data brokers are able to purchase this highly sensitive information from cell phone apps and sell it to the highest bidder. Although this issue impacts everyone, allowing the sale of location data places certain groups at heightened risk, including those seeking reproductive or gender affirming care in this Commonwealth.
After Roe versus Wade was overturned, we in the legislature swiftly passed shield laws to protect our patients and providers so people felt safe to seek out reproductive health care. But other states passed bounty bills, and those bounty bills can have repercussions. It is now time we act and, again, pass a law to fortify our SHIELD law and protect our patients and providers, as well as other vulnerable populations, from having this data used against them through unwarranted surveillance. We already know private companies are selling location data revealing who is traveling to and from abortion clinics on the open market. We already know there are bad actors out there who are actively seeking to make examples of our reproductive care providers and members of other vulnerable groups. And we know it is our responsibility as state legislators to keep our residents safe from this out-of-state interference. Notably, we would not be the first state to ban this. This legislation was passed in Maryland last year. And in light of SB 8 in Texas, with bounty hunter provisions that allow private citizens to file lawsuits against people they suspect, suspect is the word, of being involved in abortions, we must protect those seeking care in Massachusetts, and our providers, from those vigilante efforts to track their movements.
This is not, of course, only about reproductive care; safeguarding our right to move freely without unknowingly being tracked impacts all of us. The wide availability of this information has serious implications, and it demands immediate action. This should be our number 1 bill to protect our vulnerable populations. Banning the sale of location data would deny these bad actors the opportunity to weaponize this information and would protect us. I also wanna express my support for advancing my bill Senate 29, an act to establish the Massachusetts Data Privacy Act, which is a comprehensive data privacy bill that includes the provisions of the location shield and extends additional data privacy reforms. However, due to constraints and trying to keep to the 3-minute rule, I'm not gonna talk about that 1 anymore. But I really wanna thank you, and I look forward to us working together, the Senate and the House, to be proud and once again say to our residents: we hear you, we wanna protect you, and we're here to make you safe. I'm happy to answer any questions you may have.
FARLEY-BOUVIER - Thank you so much, senator. I'm gonna turn to members of the committee. Do you have any questions for the senator?
MOORE - Could I? I'm sorry, senator. Just waiting for this. What other states have already enacted this?
CREEM - Maryland is the 1 that just did it.
MOORE - Yeah. Okay. Thank you.
CREEM - And we can't let anybody else be number 1. We're always number 1. What's the problem? Thank you so much.
REP VARGAS - Thank you, Madam Chair. Thank you, Mister Chairman and members of the committee. We appreciate your great attendance today for this hearing and, wanna thank you for your continued work on this issue. It's pretty clear and apparent that this is an issue that is of high priority to a lot of folks in the Commonwealth and to this country at this time. And so, just appreciate the diligence with which you each do this work. My name is Rep Vargas. I'm here with Rep Rogers. We're here to testify in support of an act establishing the Massachusetts Data Privacy Act, the MDPA for short. We appreciate all the work that this committee did last session to get a version of this bill out of committee. And we hope to continue to work with you all to make that possible again. As you know, our online activity is under constant surveillance. Sometimes knowingly and sometimes unknowingly. Some consumers know how much data they're providing and the risks associated, and others perhaps don't have the time or ability to dig into the terms, conditions, and the small print at the bottom of the screen.
Not only is our personal information collected and stored without our consent, but it is also sold as a commodity. Often without regard for the consequences it may have for the individual. Our personal information, including but not limited to our age, gender, race, and location, is collected and shared routinely. Private information to us becomes a profit generator for companies, and I'd go so far as to say that it's not only a profit generator, it is the main asset and profit for many of these corporations. Moreover, unregulated data practices allow companies to collect massive amounts of personal data that are unnecessary to create the core products or perform the core services that they are intended to provide. The needless aggregation of data puts consumers at risk of data breaches and leaks, and without a clear set of regulations around data privacy, there is little incentive for companies to change their practices. Many of these companies do great things. We're very lucky here in Massachusetts in the United States to have some incredible technology companies. And our world wouldn't be the same without them.
However, the constant aggregation of this data without consumers' knowledge and the selling of this data without consumers' knowledge pose a significant risk and threat to consumer privacy. Technology companies deserve the opportunity to profit, innovate, and grow. But it should not come at the expense of people's most sensitive information being stored and sold without consent. Effective data practices should also not come at the expense of the innovation and profitability of smaller businesses and entrepreneurs. In the current regulatory environment, large companies have the upper hand over smaller businesses that don't have the initial investments in data aggregation.
Many small businesses without their own technical capabilities rely on web and data services of great companies like Amazon and Google. However, the access that large companies have to these data points has been leveraged to gain asymmetric benefits in the marketplace, leaving small businesses at a constant disadvantage. More plainly put, big tech companies and marketplaces can initially help small businesses get their products out to a wider audience and market, but those benefits are eroded once these large corporations take advantage of data generated by these small businesses to then develop their own products that quickly absorb the small business customers. Our laws are outdated and do not reflect the significance of this issue.
An act establishing the Massachusetts Data Privacy Act corrects this problem and establishes a comprehensive data privacy framework that will bring our laws into the 21st century and provide consumers with agency. I won't go through the entirety of the bill because I'd probably bore you, and you've got some great folks coming up behind us. But more than anything, I just wanted to leave the impression that this is 1 of the most important issues of our time. And I hope that this committee can continue to prioritize protecting consumers and small businesses, particularly in an environment where, at the national level, consumer protection and privacy are being rolled back. This is a time for Massachusetts to step up to the plate, to catch up to other states, and to show the rest of the nation how to move forward on this issue. So, with that, I will turn it over to my colleague, who will be much more impressive than I have been.
REP ROGERS - I don't know about that. Well, Chair Farley-Bouvier, Chair Moore, thank you. Members of the committee, thank you. Forgive me if I'm distracted. I'm mostly focused on whether I look good on TV up here. I think I actually do, if I could say that. But no. So you're gonna hear so much today, and I don't wanna go on at great length because you have a lot of folks from the public who wanna testify. But, as my colleague, Representative Vargas, said, this is a big issue. I think 22 states, over 20 states, have acted. So, you know, I think it's a ripe issue for the legislature to be looking at now. And some of you have already heard me say this, so I apologize if you have, but I kinda like it: we don't run searches on Google; Google runs searches on us. And so many companies now are gathering data at an astonishing pace. They have a compendium of information on all of us: where we go, what we eat, where we're driving, what we're doing. And by the way, as Representative Vargas said, I'm sure some folks who represent large tech companies are here. The innovation, the dynamism of our economy, you know, we wanna support innovation, and we want companies to succeed. But that's not what's in question here, and don't ever believe that's what's in question.
We're not trying to rein in the growth or the success of these companies, but we are saying there have to be some sensible guardrails for how data is collected. So, the Massachusetts Data Protection Act regulates how companies collect, process, and transfer our personal data. It's a matter of privacy rights, and specifically, the bill will set limitations on how and when sensitive data can be collected and sold to third parties, and provide individuals rights of data access, correction, and deletion, as well as a targeted advertising opt-out. You know, all of us have the experience of going on the web. We're looking around at different things, and then it says, well, here are the terms of service. I'm actually a lawyer who negotiates contracts, and even for me, those terms of service are incredibly difficult. They're dense. They go on for pages. But let's face it, 99% of people never even look at them. They just wanna get to where they're going, and they just click okay.
And they've agreed to things they don't even understand. So, data minimization is a key thing at the core of this bill. In other words, if you collect data as a company, you limit it to what the consumer is using it for. So how can that go wrong? Well, an example of an unsuspecting collector of your data is your car. The car gets information about how many people are in the car, how fast you're going, whether you braked too hard, and then there's evidence that it's being sold to insurance companies. Unknowingly, people don't even know this is happening. In a free country that believes in liberty and individual rights, that's preposterous.
It's unfair, and it needs to be stopped. So data minimization is just the notion that the company uses data only for the specific purpose the person is there for. They don't gather all this other information and start selling it to third parties. I mean, it's just common sense. It really is. It's not overreaching. Another key provision is data-level exemption as opposed to entity-level exemption, and that basically gets to how federal laws like HIPAA work. If a company collects, say, health information, that entire entity is exempt from our state laws on data privacy. That makes no sense. It's much more logical that we have a data-level exemption specifically targeted to the data and the use the consumer intends.
And so an example of how that can go wrong is that pharmacies collect a lot of information about us, and some of it's protected by HIPAA, and they're very good about keeping it confidential. But other data these pharmacies get about us, they sell to others. And, again, consumers don't even know this is happening. So, I wanna wrap up because you have a lot of folks here, but I wanna mention H 98, which is an act relative to Internet privacy rights for children that's modeled after a California law. The Eraser Bill aims to protect the privacy rights of children. So please take a look at that 1. H 96 is an act to provide accountability in the use of biometric recognition technology and comprehensive enforcement; some of the concepts in there are embodied in the data privacy bill as well. So I could go on at greater length, but I better not. Thank you for your indulgence. Thank you for listening, and thank you for all your hard work on the issue. I know you've both been great leaders, but the House chair in particular, I mean, I know you've really dug into this, and so thanks for your leadership.
FARLEY-BOUVIER - [Inaudible]
VARGAS - Great question, and I think there are a couple of ways to think about it. One is on the privacy side of things, making it clear and easier for small businesses to be protected in terms of how their customers' data is being used. So that's sort of a legal protection for them. And then there's the business case for them as well. Right? They're collecting all this data on customers and then providing that data to bigger companies that then can say, well, I noticed that you're selling all of these coffee mugs. Maybe I'll just go ahead and make my own coffee mug based on the data that you've created, and I'll absorb that customer segment and completely take your customer base. And so, small businesses are heavily impacted by this. We also have some exemptions in here to ensure that small businesses aren't overly burdened by this as well. And so I think it's a really important component that often isn't talked about enough when we think about this issue.
SPEAKER8 - Thank you.
FARLEY-BOUVIER - Questions from the committee.
REP VITOLO - As with much of the legislation we consider, there's a tension between specializing something for our commonwealth and the legislation of the other 49 states. Can you speak to how, in legislating on this topic, we manage that balance? How do we make sure we're getting it right, given the tension between state and federal visions of this?
ROGERS - Well, Representative Vitolo, thanks for that excellent question. And that is important because I don't think we're gonna see federal action anytime soon on this issue. We've seen, if anything, an alliance between tech CEOs and the current administration in Washington. So we can't wait. And 22 other states have already acted. But as you point out, the companies will say, look, we don't want a patchwork quilt of varying regulations and laws around the country. So there's a concept that gets to your question, perhaps you've already heard of it, called interoperability: essentially, that the defined terms we use and the model we adopt should be in harmony with other states. I believe that the bill filed by Chairwoman Farley-Bouvier gets to a lot of that and makes an effort to address interoperability. So I think it's a fair point for those who are advocates for the tech companies to raise, but what it can't be is a reason not to act. Too often, the concern underlying your question becomes a roadblock, and then we don't act. So whatever concerns are being raised by our friends in the technology sector, that's fine, and we need to hear them out for sure. But it's not a reason not to act. The time to act is now. And so, thanks for your question.
VARGAS - Thank you. If I could just expand on that a little bit. You know, the Consumer Financial Protection Bureau no longer exists, for example. Right? FCC rollbacks are happening, you know, on a weekly basis at this point. So this is very, very real. Alright? And we don't have the opportunity to just wait for, you know, 4 years, when maybe there's a change at the administration level or when Congress decides to act on this. The reality is that Massachusetts residents and consumers are being hurt right now. And the fundamental question, when you boil this down, is: is the additional work this puts on these companies too much to ask in order for us to protect our constituents? And I think the answer is no. I think they can take on the opportunity to figure out how to make sure that they're complying here in Massachusetts while also complying in other states. We have to make sure we do it in a way that's reasonable. Right? But if the trade-off is a little more work in order to protect consumers, I'll take that trade all day.
SPEAKER13 - Thank you.
FARLEY-BOUVIER - Anybody else down here? Here. Senator.
MOORE - Yeah. [Inaudible]
ROGERS - Great question. Candidly, I don't have a detailed answer for you on that point. I haven't investigated or studied that. But I can say, just speaking in broad brush strokes, that we would have heard about it in the national news, or, you know, the Washington Post, the New York Times, the Wall Street Journal. We'd be hearing more and more about how, oh, these laws have really made our business model unworkable. To the best of my knowledge, that's not happening. And we can draft a law, with the talented folks on this committee and the talented chairs, that protects consumers and still lets companies do what they need to do to be successful. So thanks for the question, but I'm not aware of any major disruptions because of laws that have been passed so far. Now, they may impose some costs on these companies, but that's true in every industry. Right? I mean, you know, we need to protect the public. So thanks.
FARLEY-BOUVIER - Anyone else? Thank you very much, and I appreciate it. Next, we have Representative Lindsay Sabadosa. And then, on deck, we have Andrew Kingman. Welcome, representative. Nice to see you today.
REP SABADOSA - HB 99 - SB 47 - Nice to see you. Good afternoon. I'm sure we're gonna be called up to vote the second I start speaking, so let's see if I can get through this. I am here today to speak about H 99 and S 47, an act relative to surveillance pricing in grocery stores, which I filed with Chairman Moore. So, as we've heard today from everyone who's testified thus far, technology is moving at a rapid pace. Some people who would have written their testimony by hand a few years ago are probably now writing it with ChatGPT. So we know that things have changed. And one of the things that I think we need to be proactive in preventing is surveillance pricing and surveillance advertising. These are the 2 issues at the heart of this legislation, which focuses particularly on grocery stores. And you'll see, from the written testimony that will be submitted by others today, that we have a lot of food banks and food justice organizations in support of this bill, because they understand that the future could be very grim if we don't take action.
And I just wanna say, I'm talking about the future, but already in 2019, Kroger, a grocery store in the United States, piloted this technology with Microsoft, and they're now working with a company to sell this technology countrywide. So the future may be much sooner than we think. I wanna dig into just a tiny bit of what this actually means, though. So what is surveillance pricing? Surveillance pricing can happen in a couple of different ways. It can happen through, for example, electronic price tags, which grocery stores are planning to roll out, so that instead of those nice little slips of paper that teenagers currently put in there for you as you grocery shop, there'll be an electronic monitor that tells you the price of the product you're buying. Because it's electronic, it can be easily adjusted. This, when paired with surveillance technology, which is facial recognition to a great extent and is already in most of our stores, can lead to surveillance pricing.
So that means that if I go into a store, the store can adjust the price because they think, based on the information they know about me, I could perhaps pay more for it. Perhaps Rep Hawkins goes to the store, and they think, older gentleman, he probably is going to wanna pay less for eggs than Rep Sabadosa would pay. And so they can adjust those prices. This is really a way to target individual shoppers, and there are plenty of ways that this can be done, including by providing, for example, a coupon to Rep Hawkins so that even though there's 1 posted price, he ends up paying less than I do because I am not offered that same coupon. So that's the first thing this legislation applies to. The second piece is surveillance advertising, which I would define as predatory. Surveillance technology can identify and analyze customer data in real time, looking at your previous purchases and other information the grocery store has about you.
And we all know the grocery store has a lot of information about us. Right? You all get the coupons at checkout. You know that they're doing this. So, depending on how sophisticated it is, the grocery store can conceivably serve ads to you as you shop based on what you most desire and what you have previously purchased. This means even the most disciplined shopper could be enticed to buy things that they weren't really intending to buy. So I just wanna bring this all back, because we're talking about food. We're not talking about luxury goods. We're not talking about clothing. We're talking about something that is essential to our survival every single day. And most people, when they are looking for affordable food, go to the grocery store. Right? That is where we think we can buy food that is the most reasonably priced. And as Rep Farley-Bouvier probably knows, in rural parts of the state, that can mean traveling a really, really long distance.
So our goal in trying to ban these 2 features, surveillance advertising and surveillance pricing, is to ensure that when people go to that grocery store, they're getting a fair price. They're not getting different prices based on their demographic data or the data the grocery store has collected, and they are able to grocery shop. I just wanna say that there have been so many questions about the survival of businesses. Grocery stores are not going out of business. They are very profitable. Meanwhile, 180.4% of Massachusetts residents have reported food insecurity. And we heard just yesterday at the Ways and Means hearing that about 80,000 are going to food banks to get their food. So, when we're looking at how we weigh these things, the people of Massachusetts are currently suffering. We wanna do something proactive to prevent any future suffering to the greatest extent possible. This is a first-time file, so I appreciate the early hearing on it, and I am very happy to answer any questions and, of course, work with the committee on this.
FARLEY-BOUVIER - Thank you so much, representative. Questions from the committee? On here? Over here? Senator. Surprise. It's a great partnership.
SABADOSA - Are you going to answer as well? [Inaudible] Of course. [Inaudible] I'm sorry. Any federal?
MOORE - Federal preemption or federal law discovery decision?
SABADOSA - No. There should be none. We have flagged this. So there's nothing? There is nothing. No. We, before the outgoing administration left, did do some work vetting this legislation with them. This has been a long time in the making, and they didn't raise anything. Of course, the current administration is a different administration. So, you know, we would have to go through that process again. Their interpretation of the law seems to be a little different.
SPEAKER16 - Thank you. Yes. Thank you.
SPEAKER1 - Thank you.
SPEAKER5 - Thank you so much.
SPEAKER1 - And you did it.
SPEAKER17 - I did.
SPEAKER5 - In 3 minutes and before we got called. Thank you.
FARLEY-BOUVIER - Up next is Andrew Kingman, and then we're gonna go online for Chris Gilrein. Mister Kingman, welcome. Okay. We're having a little tech issue here. I'm sorry.
SPEAKER7 - No We're
SPEAKER1 - gonna share. We can share.
ANDREW KINGMAN - MARINER STRATEGIES LLC - Good to go. Can folks hear me okay? Yeah. Good afternoon, Madam Chair, Mister Chair, and members of the committee.
FARLEY-BOUVIER - Bring that mic.
KINGMAN - My name is Andrew Kingman. I represent the State Privacy and Security Coalition. We're a Massachusetts-based coalition representing about 35 companies and 6 trade associations across a number of different sectors. I also have a background as a privacy compliance attorney helping businesses work through various state privacy laws. You're here today to consider 4 different privacy frameworks, plus a number of sectoral bills, which begs the question: how should Massachusetts proceed and navigate this? I don't wanna sit here and tell you that there's a perfect privacy framework. In my experience, these frameworks represent a series of choices and trade-offs that legislators must make. I'd like to discuss, in particular, HB 80, which is the framework that currently covers over a hundred million US consumers, has been adopted on a bipartisan basis in Connecticut, Rhode Island, New Hampshire, and 15 other states, and is the only framework currently in effect in those other states other than California.
HB 80, the comprehensive Massachusetts Consumer Data Privacy Act, which is different from the Massachusetts Consumer Privacy Act and the Massachusetts Data Privacy Act before you today, keeps Massachusetts consumers and businesses integrated with the rest of the New England economy because it's the same framework that other states are currently implementing. It establishes heightened restrictions for reproductive health data and for children's data, and does so in a way that has not triggered litigation in other states, along with heightened protections for geolocation data, biometric data, etcetera. It uses a data minimization standard that currently covers, or is consistent with, over 600,000,000 consumers globally, including the European GDPR standard, and is consistent with what exists in California.
With regard to advertising, sales of data, sensitive data, and when consent is required, it is very clear and understandable. And in my experience, clarity in a law equals compliance. The more that businesses can understand what's required of them, the easier and better it is for compliance, which is better for consumers. In contrast, a number of the other frameworks before you today would be novel, distinct even from the frameworks that were brought up earlier. The data provisions in some of those frameworks are very concerning because companies don't know what's expected of them or how to comply at this point. They can also, clearly unintentionally, have the effect of depriving marginalized populations of goods and services designed for them. And they can make it harder for Massachusetts businesses to reach their own customers than it is for those same businesses to reach customers in New Hampshire, Connecticut, and Rhode Island. All of these issues are exacerbated by the existence of a private right of action, which HB 80 does not have. I see that I'm out of time, so I will stop here and be happy to answer any questions.
FARLEY-BOUVIER - Sure. Representative Owens. Okay.
REP OWENS - Great. Well, you ended at exactly the right time because that was my question. Because I noticed that HB 80 did not have a private right of action.
KINGMAN - It does not.
OWENS - Have you had conversations with the attorney general's office about her, or her office, being the enforcement authority for this? And can you talk a little bit more about why that is so problematic for the folks that you represent?
KINGMAN - Sure. We have not yet talked with the attorney general's office about that issue. I would note that no other privacy statute, no other comprehensive privacy law in the country, has a private right of action. California does have a limited private right of action for only certain types of data breaches, but not for privacy violations. What we are seeing is states with attorney general enforcement staffing up their offices to have their own data privacy units, or bringing on data privacy attorneys to help with that enforcement. And we're seeing that pay off in consumer education about the rights that consumers have and the obligations that are expected of businesses, and in enforcement actions that those offices are taking. The reason that we oppose a private right of action is simply that they don't work, and we have seen them abused in other states. Studies have shown that funds recovered by class action lawsuits and private rights of action go to the trial attorneys and don't make their way to consumers. Again, no comprehensive data privacy law has one. We've seen trial lawyers abuse private rights of action in other statutes, and we don't believe it's the right solution for a state to take on. And, again, no other state has done that. That's blue states, red states. There's no partisan split there.
OWENS - You heard Senator Creem before; we do like to be first, so we'll be first.
FARLEY-BOUVIER - All set. Is anybody else over here? On this side? Rep. Meschino? Oh,
REP MESCHINO - So thank you for your testimony. You started off by introducing the state privacy and security coalition.
KINGMAN - Yes.
MESCHINO - Who are the members? Who do you represent?
KINGMAN - We have 35 companies, and I probably would forget some, so I won't list them all here. No.
MESCHINO - I'd like you to give me some examples.
KINGMAN - They're on our website. You can visit our website at spsc.org, and that's the best place to find out.
MESCHINO - So you are not prepared to answer that question. Okay. So I was struck by your remark about choices and frameworks and integrating into the rest of the New England economy, and that it needs to be clear and understandable. What part of "don't sell our data" was not clear and understandable to you?
KINGMAN - What bill specifically are you referring to? So I think, with HB 78, one of the concerns that we have there is that the definition of sale, as it has evolved through privacy working groups and through its adoption across the other states that have adopted this framework, has been defined intentionally to really cover any transfer of data, not just what businesses would think of as a sale, or what consumers would think of as a sale. Right? Not just I give you data, you give me money. It's really any transfer of data in exchange for any monetary or other valuable consideration.
MESCHINO - So any transaction. Okay.
KINGMAN - So virtually any transaction. HB 78 also includes a definition of the term transfer, which, I think, from a compliance standpoint, would be virtually indistinguishable from the definition of sale. But the bill attaches very different obligations to a sale versus the obligations it places on transferring data, and those are the types of ambiguities, again, that I think, from a compliance standpoint, make it very difficult, and that would be further exacerbated by a PRA.
MESCHINO - But we managed to do that with HIPAA, and it seems to me you're splitting hairs. So if you are collecting data and you're transferring it for a monetary purpose.
KINGMAN - That's correct, which is why having different obligations tied to the definition of transfer and the definition of sale would be very difficult, because almost any transfer of data under HB 80 or HB 78 would be considered a sale. So trying to distinguish what the compliance obligations are between a sale and a transfer is very difficult.
MESCHINO - So I think you just explained that it is actually consistent, and it is actually clear. The other thing I just wanted to ask: could you talk a little bit about depriving the marginalized of services designed for them? What is that?
KINGMAN - Sure. So HB 78 and a number of the other bills have extensive definitions of what constitutes sensitive data, and then targeted advertising using sensitive data is prohibited. And so you may have veterans who would have discount programs or particular products that would be meant for them. And due to the data minimization rules in these bills and the prohibition on targeted advertising based on sensitive data, those types of activities, or those products or services, may not get to them. Think of products or services designed for the LGBTQ community or things along those lines. If there's no ability to make people aware of products or services based on that, then, you know, we think that that's concerning. Under HB 80, they would have the opportunity to provide consent or opt out of targeted advertising, to say, I don't want that to happen. So just an outright prohibition, we think, could have some of those unintended
MESCHINO - So it does seem that you are capable of the opt-out, but I would argue it should be an opt-in.
KINGMAN - Well, there is opt-in consent for sensitive data in HB 80, which we're fine with. There's just not an outright prohibition on targeted advertising using that. It's up to the consumer to decide whether they want that information to be collected at all in the first place or to be processed for that type of activity.
MESCHINO - You might wanna rethink your idea of marginalized because I'll go on the record of saying I don't want you to capture that for me, and I don't feel that I am.
KINGMAN - And we believe everybody should have that choice.
SPEAKER7 - Thank you.
FARLEY-BOUVIER - Thank you. Anybody else? Representative Vitolo?
VITOLO - Good afternoon, and thank you for being here. I want you to know that I believe in you. I am not an attorney. I have colleagues who are attorneys. I think the world of them; I know they're fully capable of reading. You're the one, then. Well, that may be. But I know they're fully capable of reading state laws in different states and understanding the differences. And I believe in you, and I believe you believe in me. So frankly, this idea that we can't have a patchwork, I find absurd. My job is to create a patchwork of laws in this nation. If we didn't have a patchwork of laws, we wouldn't have jobs. We would just have a US Congress. Right?
So I push back strongly on the idea that you and your colleagues are incapable of reading MGL and understanding what we expect in our commonwealth, and reading the laws of the great state of North Carolina, and figuring out what they want. I believe in you. I just wholeheartedly push back on this idea that we should have a single set of laws in this United States on issues that affect us, personally and culturally. And it just might be that the great state of Massachusetts has a different idea of privacy than even my state, that I was born and raised in, Connecticut. And frankly, I'm okay with that, and I think the people in this room are as well. I hope that you and your colleagues make lots of money figuring out the difference in these 50 sets of laws, and I wish you godspeed in doing that.
KINGMAN - Yeah. Well, I suspect we would, I, you know, I think we disagree. I think our position is that, you know, I'm a Massachusetts resident. I travel to New Hampshire. I travel to Maine all the time. I think it makes sense to have consistent expectations as a consumer in terms of what my rights are. I think it makes sense for businesses to have consistent expectations of what's expected of them across state lines. So understanding that, you know, a Massachusetts business or consumer may not mind whether this framework works with Oregon or Montana, I think it makes sense to think of New England as a regional economy, as we do with many other things.
VITOLO - New Hampshire, go ahead and advocate for a sales tax, so we have consistency. Right? I mean, there's general consistency. Right? We all use, well, none of us are using the metric system in our measurements, except for 2 liters of soda. But this idea that we should be uniform when it benefits your corporation, but not uniform in other ways, just doesn't hold water with me.
KINGMAN - Well, and I appreciate your hope for my financial success. But, you know, I think what I would say is, you know, when the privacy lawyers are getting paid, that's coming out of the business's pockets. Right? And I think you'll hear from some of the small or the Massachusetts-based businesses today about the consequences of that or their concerns with that. So thank you.
SPEAKER13 - Thank you.
FARLEY-BOUVIER - I just wanna get to Rep Moakley quickly, and then we're gonna come back to Rep Meschino.
REP MOAKLEY - Thank you, Madam Chair, and thank you, Mister Kingman, for your presence and testimony here today. Just wanna follow up on some of Representative Meschino's questions about the industry or the coalition that you represent. Sure. Do you have clients? Would it be fair to say that you have clients that have sold private data and location data?
FARLEY-BOUVIER - Oh, here you go.
MOAKLEY - I looked it up.
KINGMAN - Yes. Probably, that's the case, given the broad definition of sale here. Again, any transfer of data virtually is a sale.
MOAKLEY - And would you be able to estimate even if a ballpark for this committee, the portion or the number of Massachusetts residents whose data has been sold by your members?
KINGMAN - No.
MOAKLEY - And how about the amount of money, say, in 2024, that was made by your membership?
KINGMAN - No.
MOAKLEY - Thank you, Madam Chair.
FARLEY-BOUVIER - Okay. Sure. Rep. Meschino?
MESCHINO - Oh, well, so, as instructed, I looked it up, and we actually had Internet, so I could. It's a Virginia-based organization. Google.
KINGMAN - We were actually incorporated in Massachusetts at this point. You're probably looking at a prior year's tax return.
MESCHINO - No. I'm looking at your website. Meta, NetChoice, Google, Netflix, Comcast, I mean, there's Nike. Yes. Those are all your members.
KINGMAN - As I said, we're a multi-sector coalition. So we have retailers, payment card companies, health care companies, and tech companies.
MESCHINO - TechNet. So Verizon, Walgreens, Yahoo. So I just wanna make sure everyone understood that you represent pretty much the broad spectrum of every single place that we're trying to wrangle.
KINGMAN - Yes. And we work to try to get solutions that work on behalf of all sectors of the economy.
FARLEY-BOUVIER - Thank you. Thank you so much. Is anybody over here? I do have a couple of questions. Following up on Rep Moakley's question about how many people in Massachusetts have had their data sold: I'm gonna give an estimate, and I think it's 7,000,000, because I think every resident of Massachusetts has had their data sold. I do have a couple of kind of specific questions. I very much appreciate your presence here. I'm sincere in that. We're gonna do a little bit of work, though. Okay? I look forward to meeting with you soon to be able to wrangle a few things out. You mentioned Connecticut, and because you work in multiple states, this is helpful to us. Again, you do this work in real time. And so I'm sure you're familiar with Senator Maroney, who is a national leader in this space, and with 1356. Can you talk to me about that bill, H 78, and the original Connecticut law? I call it Connecticut 1. And I would like you to focus on data minimization in those bills. Sure.
KINGMAN - I think data minimization is 1 of the items that's under discussion in that state, and, you know, I don't know where that's going to end up, but my understanding is that it's under discussion. I think 1 of the benefits that we've seen from Senator Maroney's legislation is that it's a framework that can be built on over time and modified. So, 2 years ago, he added protections for consumer health data and reproductive health, prevented geofencing around reproductive health clinics, and also worked on legislation that added protections for children, things like making sure that children are notified if their precise geolocation is being tracked, and that products and services aren't designed to significantly or unreasonably extend their time on a particular platform, etcetera. And then we've seen the status of a victim of a crime added, or proposed, I think, this year, to the definition of sensitive data, and status as transgender or non-binary added. So it's a framework that has proven to be workable over time, that can be tweaked and added to. I think that is the benefit of the framework that's reflected in HB 80 here.
FARLEY-BOUVIER - Okay. And then I'm gonna focus on data minimization, because that seems to be a big concern. Can you tell me some concerns that you have around H 104 and H 78 on data minimization, compared to H 80?
KINGMAN - Yes. So, the data minimization provisions that are reflected in H 80 are the same data minimization standard that is in the GDPR and is consistent with California. California has a little bit of a different twist, and it is consistent with the 17 other states, I think, that have adopted a similar framework. Maryland, as you know, adopted a very different framework that's reflected in H 78 as well. And I think it's not in effect yet. So, we don't have a lot of data as to how businesses are planning to comply. But I think the concerns are a couple of things. The fact is that the data that can be collected is only data to provide a specific product or service requested by a consumer.
There are concerns there around providing product recommendations based on prior purchasing history, or whether a company can provide updates or security patches to an app because that's not being specifically requested. And some of the solutions that I have heard companies talk about, just at privacy conferences, talking to chief privacy officers, some of them are saying, we may just put Maryland in a different bucket and not provide those updates or not provide product recommendations for those consumers. It may be that we create many more click-throughs that consumers will have to deal with because we have to create a regulatory trail to make sure that, you know, the enforcement authorities know that the consumer has actually requested this product. So, that may be 1 of the strategies, but it doesn't exist anywhere else. And so it's very untested.
I think the other concern is that businesses may take different approaches to how to comply with that, versus a more uniform, hey, we understand what the standard is, we do it all over the world, we know how to do that. And I think the last piece is that 1 of the benefits of having the data minimization standard be tied to a privacy notice is that enforcement authorities can go directly to the privacy notice to say, is what you're doing consistent with what you're telling people you're doing? And is it adequate, relevant, reasonably necessary? Versus, if a, you know, bad actor decides that your precise geolocation data or your biometric data is strictly necessary to provide that product or service, then the consumer doesn't have the opportunity to say, no, I don't want you to collect that. That's where, in our opinion, the consent mechanism in H 80 is important.
FARLEY-BOUVIER - Okay. Well, thank you. It sounds like we need to work together on data minimization, and we look forward to continuing to work with everybody here. I look forward to continued conversation to be able to give specific feedback so that we can reach our shared goals together of protecting the people of Massachusetts. Great.
KINGMAN - Okay. Appreciate that.
FARLEY-BOUVIER - And I'm gonna let house members know that there's a roll call, so please come back if you can. And thank you, mister Kingman. I'm gonna ask for the indulgence of Chris Gilrein, who I just said was on deck, but we are gonna turn to Senator Driscoll, who is here. And senator, you'll forgive me, but I'm gonna go and vote and then come back. Okay? Thank you.
SEN DRISCOLL - SB 33 - Thank you, Chair Farley Bouvier, Chair Moore, and members of the committee. I will submit written testimony as well but wanted to appear before the committee today and ask respectfully that S 33, An Act establishing comprehensive Massachusetts consumer data privacy, receive a favorable report from the committee. Data privacy and security are among the most pressing issues for consumers across the Commonwealth and the country. A 2023 Pew Research poll found that 81% of Americans were concerned about how companies use the data that is collected about them, with 67% reporting that they did not understand what type of data has been collected about them. And beyond that polling, we've all heard the stories. Someone searches for a product on 1 website, and within a few days or, in some cases, a few minutes, their social media feeds are bombarded with targeted advertisements, pushing them to buy 1 brand or another.
Increasingly, companies are gathering, processing, and ultimately selling the data of our constituents, many of whom have only a faint understanding of the nature, scope, and depth of the data that these companies are handling. This goes beyond just advertising, of course. Often without consumer knowledge, companies are building entire personal profiles on consumers, predicting their feelings on issues or products, behaviors, and decision making with increasing accuracy. As I'm sure you've heard and are aware, there really isn't a federal standard for consumer data privacy at this point, and there needs to be 1. So in that void, year after year, our use of tech and reliance on the digital world grows and grows. This bill, S 33, is more than a conversation starter. It contains an approach and proposal that 17 other states have coalesced around, adopted, and implemented.
This legislation aims to inform and empower consumers about the specific nature of the data that companies are collecting and processing and provides our constituents with tools to opt out of the sale of their personal data for advertising or data profiling. This act would require companies doing business in the Commonwealth to respond to consumer requests confirming whether the company is collecting and processing their data, to provide copies of that data, and ultimately to delete their personal data on request. Again, 17 other states, including several of our neighbors here in New England, have adopted legislation creating similar or identical safeguards, and this act builds on the critical work that was done last session in regard to the Data Shield Act and expands key consumer protections. As automation increasingly eases the workload of collecting, processing, and interpreting consumer data, consumer privacy is jeopardized in an unprecedented manner. In massive numbers, websites, apps, and other programs are tracking consumer data and building detailed data profiles.
Massachusetts has led the way on consumer protections in the past. We were among the first states in the nation to pass state-level consumer protection laws in the 1960s, prohibiting unfair and deceptive acts and practices and empowering our residents to take action against companies that wished to exploit them. But our current consumer protection laws are under-equipped for our new AI-driven digital era. This legislation aims to give our consumers the tools they need to take agency over their own data and bring our consumer protection into the modern age. I hope the committee will utilize S 33 as the primary vehicle to advance this conversation in Massachusetts and to deliver a comprehensive, interoperable directive for how we handle consumer data privacy in the Commonwealth. Thank you and the committee for hearing my testimony today. Again, I'll make my staff and myself available to answer any questions and, hopefully, work with the committee on this matter. Over there on the other side? So far, so good. It's a lot smaller.
FARLEY-BOUVIER - Yeah. Terrific. Well, thank you very much. Senator, are you good? Is anybody over here? Okay. Thank you very much for your testimony. So, now we're going virtual. Chris Gilrein. You here, Chris? And then on deck, Caitriona Fitzgerald is next. Chris?
CHRIS GILREIN - TECHNET - SB 33 - ( R ) Thank you, Madam Chair, Chairman Moore, and members of the committee. I really appreciate the opportunity to be here. My name is Chris Gilrein, and I represent TechNet. I'm taking the liberty of pasting a link to our members in the chat, but we represent about a hundred companies throughout the innovation ecosystem, here in support of S 33 and H 80. Indulge me a little bit of history. California was the first state to enact a comprehensive consumer data privacy law. 7 years, multiple amendments, and enforcement agencies later, that is still evolving. A Washington state senator witnessed that and sought to develop a data privacy law that was clear and explicit in statute and avoided the hassle and expense of constant rulemaking and reinterpretation. While it did not pass in that senator's home state, it was adopted in Virginia.
Colorado and Connecticut followed and expanded upon that framework. It has been the subject of multiyear public stakeholder meetings in over a dozen states and in hearings and stakeholder processes that include many of the folks that you will hear from today. The result is a comprehensive, risk-based framework of consumer rights over how their data is collected and used, and of controller responsibilities to those consumers and to the protection of that data, enforced by the state's attorney general. It's a model that has been thoroughly vetted and adopted in red, blue, and purple states, including our New England neighbors in Connecticut, New Hampshire, and Rhode Island. This is where I acknowledge that compliance, even with a standard model, is not free. This does come at a cost to companies, including my own. However, by following a model that is already in place, the cost to companies doing business in the commonwealth is orders of magnitude less than if the state advances its own unique standard, which would require custom compliance solutions.
Other comprehensive bills on the docket today include provisions, as you've heard, that deviate significantly from the standard in 18 states. They rely on novel definitions and are enforced with private rights of action, which, as you heard, is not how any comprehensive data privacy regime elsewhere is enforced. That would make Massachusetts an outlier, harming our competitiveness and creating significant confusion and cost. There are also a number of standalone bills on the docket to protect certain types of data, whether that be biometrics, location, or neural data, and there are bills in other committees that would safeguard health information. I understand that there may be an instinct, after taking hours of testimony today on this complex subject, that maybe an incremental approach is better, going after the low-hanging fruit like location or health information. But a piecemeal approach, we argue, is more complicated from a policy perspective and still requires a complex compliance infrastructure. Leveraging the work of more than a dozen states and implementing an interoperable standard is the low-hanging fruit. We ask that you advance S 33 and H 80. Thank you.
FARLEY-BOUVIER - Thank you, Chris. Are there questions from the committee? Senator?
MOORE - Thank you. Good afternoon, Chris. A quick question for you. Do you think the AG has the resources to handle the number of complaints that could possibly come in?
GILREIN - So what we've seen, and what we've discussed in other states where we've had this conversation, is that states can create a separate fund where the proceeds from any successful enforcement action would go, which would support future enforcement actions. It's a concept that we've discussed in other states. I don't know for a fact that it's been enacted elsewhere, but it's something that we'd certainly be able to do.
MOORE - So, just so, if we need to create a fund with additional resources, then I would say that the AG does not have the resources, or would not possibly have the resources?
GILREIN - I would say that if the AG's office does not feel that they do have the resources, that's an avenue that we've seen discussed elsewhere.
MOORE- Okay. Alright. Thank you.
FARLEY-BOUVIER - Thank you, Senator. Anybody else on the committee? Chris, I have a couple of questions for you. Let's talk about data-level versus entity-level exemptions that are in several other states. Why don't you just talk about the difference between the 2 of them?
GILREIN - Yeah, absolutely. So, the vast majority of states have what are known as entity-level exemptions. These are industries that are covered under any 1 of the various federal data privacy laws, and so, in order to avoid layering state law over laws that these companies are already complying with, they are exempted from the state laws.
FARLEY-BOUVIER - Laws. Okay. So to clarify, for the benefit of the members of the committee, the idea is if you are already complying with, for example, HIPAA or GLBA, then you're complying with these robust protection laws of the federal government. So why layer the state law on it? Is that what you're saying?
GILREIN - Accurate.
FARLEY-BOUVIER - Okay. So my concern is that GLBA, for example, was created, I think, in 1996. And if I have it right, that is when Motorola first introduced the first flip phone. Okay? Now, I think that it's important to understand how far technology has gone since 1996 and to kind of put that into context. Also, the world of data collection, because we have these devices, you know, in our pockets and on our wrists and now fingers and in our homes, collecting data like the temperature in your house or how often you use your lights, all these different things, means that so much more data is being collected. And so, for example, let's say a big bank. Right?
All banks, you know, have these federal regulations about protecting financial data. Right? That's mostly what they're saying: protect people's financial data. But a big bank like Citibank has a lot more data on me than just my financial data. They probably know what I had for breakfast this morning. Citibank does. So you can see my concern, that we don't tell Citibank, for example, that they have to protect all the other data they have on me, not just the financial data. I'd like you to respond to that and help us understand why it's so important to you and your members to keep the entity-level exemption. And in your answer, please focus on the consumer of Massachusetts.
GILREIN - Sure. So from TechNet's perspective, our support of the entity-level exemption is a matter of consistency across states. And so for the consumer, it is, again, a consistency in how their data will be treated, whether they are commuting to Rhode Island or New Hampshire or what have you. And for your constituents who own businesses, consistency is important in how they are going to be held to a standard.
FARLEY-BOUVIER - Okay. Well, thank you for that explanation. Are there any other questions from the committee? Sure. Representative Meschino.
MESCHINO - Thank you so much. Did you just suggest that because other states and other laws have a lower threshold, Massachusetts consumers should just put up with it?
GILREIN - I'm suggesting that our support is for a consistent standard across the state.
MESCHINO - And you mentioned cost. So, from your perspective, did I understand you correctly to say that having different standards is costly to you in terms of how you manage them?
GILREIN - Yes. So the support is for a kind of uniform standard. You know, the compliance infrastructure needed to maintain compliance, to design a structure that handles the data appropriately and does the appropriate reporting and disclosure, that costs money, right? It costs attorneys' time. And the technology on the back end needs to be adjusted. It is significantly more expensive if something needs to be custom designed for each individual state rather than a solution that fits 18-plus states.
MESCHINO - So a solution could be that Massachusetts could just become the standard to which all comply. Just think about it. Thank you.
FARLEY-BOUVIER - Thank you, Rep Meschino. Thank you, Chris. We appreciate your testimony. We're gonna bring up now Caitriona Fitzgerald from the Electronic Privacy Information Center. And do you have a panel? Is it how are we doing this? Or the panels are coming later, maybe. The panels are coming later. Okay. Thank you for that. And we're then switching to virtual, and we'll have Julie Bernstein. Thank you, Caitriona. Welcome.
CAITRIONA FITZGERALD - ELECTRONIC PRIVACY INFORMATION CENTER - SB 45 - HB 104 - SB 209 - Thank you. Thank you, Chair Farley Bouvier, Chair Moore, and members of the committee. My name is Caitriona Fitzgerald. I'm a Wakefield resident, and I'm deputy director of the Electronic Privacy Information Center, or EPIC. EPIC is an independent nonprofit that was founded 30 years ago to secure the fundamental right to privacy in the digital age. I wanna commend the chairs for their sponsorship of these important privacy bills. I'm here to support H 78, the Massachusetts Consumer Data Privacy Act, and S 45, House 104, Senate 209, the Massachusetts Data Privacy Act. As you all know, we're constantly being tracked. Right? Every click, every move, and every like is being collected and put into a profile about us. And at a time when policymakers are concerned about affordability, this is hitting your constituents' wallets. Right? The Texas AG recently sued Allstate for secretly collecting location data from apps like Life360 and GasBuddy and using it to raise auto insurance rates.
The FTC recently found that retailers are using our personal data, even down to our mouse movements, to offer different prices for the same goods and services to individual consumers. But it doesn't have to be this way. Good privacy laws can encourage companies to innovate on privacy, allow them to reach their customers, but protect consumer rights. I wanted to highlight 3 points that are critical in any privacy legislation the committee is moving forward. First, you need a meaningful data minimization rule. That word's been thrown around a lot. There was mention of it in the Connecticut law. Connecticut law says that companies can only collect and use personal data as they disclose to the consumer, which means they can just bury it in their privacy policy that no 1 reads. The bills introduced by the chairs instead say, it has to be related to the product or service I'm asking for as a consumer. You know, I don't expect that when I visit WebMD's site that what I'm looking at is being sent to Meta and Google and a dozen advertisers in the background, but that's exactly what's happening right now. So Epic supports these bills because they better align companies' data practices with what consumers expect.
And Massachusetts would not be the first. Maryland passed the Maryland Online Data Privacy Act last year and included a rule that says companies can only collect data for the product or service that the consumer is asking for. The second critical piece is a ban on the sale of sensitive data. Maryland included this in their law as well. Data brokers and others simply should not be selling sensitive data. I think we all agree on that. That's why I'd ask that, in addition to the comprehensive privacy bills, the committee also give a favorable report to the Location Shield Act. The third piece is enforcement. It is vital to pair strong AG enforcement with a private right of action. You know, we've seen in states where there's not a private right of action that the laws just aren't complied with. Massachusetts residents have been able to enforce their consumer rights in court under 93A for decades. This is what our legal system is set up for. There's no reason that privacy laws should be any different, that our privacy rights should be any different. If the AG doesn't, you know, have the resources to bring a case and my rights are violated, I'm just out of luck, and the company gets to violate the law. That's not fair. So a private right of action is really critical to making the law meaningful. So I ask that the committee incorporate those 3 critical points into any bill that it's reporting out, and thank you for the opportunity to testify today.
FARLEY-BOUVIER - Thank you, Caitriona. Wanna turn to committee members? Sure. Rep Owens.
OWENS - I just wanna, since you ended on the private right of action, I just wanted to highlight that. We heard testimony earlier that it doesn't work because consumers don't get compensation. I think I've got probably a check for $14, or for 14¢, sorry, sitting around from something or another. How would you respond to that?
FITZGERALD - Yeah. I'd say, unfortunately, it does end up where so many consumers are harmed that when the judgment gets divided, it is the silly 14¢ payouts. But I'd say they are effective in changing business practices, which is the goal. We've seen this in Illinois. They have a biometric privacy act that was passed in 2008, before big tech's lobbying machine stood up, so it passed. And that law has forced, you know, there's a company called Clearview AI that scans all of our photos online and is building a database of every person on earth. The ACLU sued under the Illinois Biometric Privacy Act, and Clearview AI was forced to stop selling its database to private individuals and law enforcement. You may also remember when Facebook used to tag everyone's faces on Facebook by scanning photos; it stopped that business practice. So, it may not be effective in getting people individual remedies, but it is effective in encouraging compliance in the first place and ensuring that if there are violations, they're remedied.
OWENS - Thank you.
FARLEY-BOUVIER - I have a follow-up to that. We hear a lot that it particularly harms small business. Can you tell us more about private rights of action and small business?
FITZGERALD - Yes. You know,
FARLEY-BOUVIER - How can we, let me be more specific in that question. I'm sorry I interrupted. You know, I obviously have small businesses in my downtown. Yeah. We call it North Street, in Pittsfield. Like, they are different, yes, right, than Meta. Right? So how do we protect them?
FITZGERALD - Yeah. We recognize that even the threat of litigation is very difficult for small businesses to deal with. I think you both dealt with that appropriately in your bills by exempting small businesses from the private right of action. I think that is a common-sense compromise to say, look, you don't need to worry about this. If there's a small business that's violating the law enough that it rises to the level that the AG is going to enforce, yes, there should be an enforcement action against them. But that removes that litigation threat from small businesses, and I think that's a common-sense approach.
FARLEY-BOUVIER - So can you address the word patchwork? Tell me what you think about all that.
FITZGERALD - Yeah. It's not much of a patchwork, because the Washington state law that Chris Gilrein mentioned, which influenced Virginia and then a litany of other states, I think, you know, say 17 or 18 states have versions of it now, was originally written by industry. I'd be happy to send articles on this around to the committee, but it's, you know, been revealed that Amazon was a big drafter of that original Washington state law, along with other big tech companies. So they wrote it so that it basically bakes the status quo into law and didn't give many meaningful privacy rights to individuals. Any rights that they gave, they made as hard as possible to effectuate. EPIC has a report out with the U.S. PIRG Education Fund where we graded state privacy laws.
Of the 19 that are out there, 8 got F's, and none got an A. Connecticut, which you've heard mentioned many times today, received a D. The reason for that is the kind of so-called data minimization rule in that law says companies can collect what's necessary for the purposes that they put in their privacy policy, and that's not a meaningful privacy protection. In fact, it incentivizes them to write the purposes in their privacy policy as broadly as possible, so it covers everything. Right? They can say, I collect your data for marketing purposes. That tells me nothing but allows them to do almost anything. So that's not meaningful, and that's why I think we've seen the original sponsor in Connecticut push forward legislation this session to strengthen the Connecticut privacy law to include a data minimization rule that limits collection to what's reasonably necessary for the product or service the consumer's asking for.
FARLEY-BOUVIER - Again, you're working in multiple states and are very familiar with the bill currently being debated in Connecticut. Also, the big change they are making is the difference between the data-level exemption and the entity-level exemption. I did ask for comments from the last person testifying about that. Can you tell us your thoughts about the difference between data-level and entity-level exemptions?
FITZGERALD - Yes. I think, you know, you made an excellent point that some of these laws were passed, you know, when flip phones were coming out. You know, GLBA requires, we've all gotten them, those paper privacy notices to be mailed, the ones that we just throw in the recycling bin. In there, it says we can opt out if we call them. I've tried it. You have to. You can't even do it online. You have to call. It goes through 17, you know, levels of pressing the phone buttons. They make it as hard as possible. And then they're also collecting a ton of data that's not even covered by GLBA, you know, the website trackers and whatnot. So with an entity-level exemption versus data-level exemptions, I think that there's a balance to be struck there. Yeah, I'd love to see exemptions go out the window entirely, but I think data-level exemptions for some of these laws are a good compromise.
FARLEY-BOUVIER - Thank you. I appreciate your testimony, and I'm looking forward to continuing to work with you throughout the session on this bill. Thank you so much, Caitriona. Yeah. Okay. So we are gonna go to Julie online, and then we're gonna be bringing up Senator Fernandez. Are you here to testify?
SPEAKER7 - Yeah.
FARLEY-BOUVIER - Awesome. So you'll be next. Okay? Thank you so much. Julie?
JULIE BERNSTEIN - CONCERNED CITIZEN - HB 104 - SB 45 - ( R ) Yes. Thank you. Can you hear me?
FARLEY-BOUVIER - We can hear you.
BERNSTEIN - Great. Chair Moore, Chair Farley Bouvier, Vice-Chairs Peano and Hawkins, and members of the committee, thank you for allowing me to testify in support of the Massachusetts Data Privacy Act, H 104 and S 45, and strong, enforceable data privacy legislation more generally on behalf of the Massachusetts civil liberties group Digital Fourth. We commend Chair Moore and Chair Farley Bouvier for their tremendous work toward this end. As we witness the roundup of protesters and immigrants by the federal government, we must recognize the constant stream of sensitive personal data, including precise geolocation, social media posts, affiliations, and more, that is legally being sold to law enforcement at the local, state, and federal levels by data brokers. Our written testimony documents how this massive transfer of data includes AI analysis that characterizes us based on our expressed opinions and affiliations, characterizations that are often flawed but are nonetheless being used to assign us guilt for speech that the government identifies as supporting terrorism or as being in conflict with its foreign policy. The Consumer Financial Protection Bureau and the Federal Trade Commission previously penalized businesses that abused our sensitive personal data.
The CFPB has recently been incapacitated. FTC chairman Andrew Ferguson declared that the agency would, quote, stop abusing FTC enforcement authorities as a substitute for comprehensive privacy legislation, unquote. The CFPB and FTC will no longer protect our private data. The California and Texas attorneys general have each enforced state data privacy laws against violating companies. State enforcement will be imperative going forward. Any data privacy bill must include detailed requirements for data rights to view, correct, and delete collected data, obtaining consent for each transfer of an individual's sensitive data, and a private right of action. Digital Fourth would like to highlight additional valuable provisions from the Massachusetts Data Privacy Act. Section 93N provides especially strong protections of geolocation data, including a warrant requirement for sharing such data with any federal, state, or local government agency or official. We support the Location Shield Act, which is based on 93N.
The MDPA and the Massachusetts Consumer Data Privacy Act both require companies to give individuals a list of all third parties to which their data has been transferred. The MDPA further requires that OCABR host a searchable website with the contact information for any third parties who are data brokers. The MDPA requires that covered entities present clearly on their homepage how to opt out of targeted advertising or profiling and how to assert consumer rights. It requires that privacy policies be detailed and informative. The MDPA compels separate privacy policies for biometric and geolocation data. It requires that the attorney general publish on a website how to report violations, reported violations of consumer rights, investigations made, and the outcomes of legal actions taken. Please send forward the strongest and broadest possible data privacy legislation. Thank you.
FARLEY-BOUVIER - Thank you so much, Julie. Appreciate your testimony today. We're gonna now go out of order and bring up Senator Dylan Fernandez. And then, on deck is Megan Evans, who is joining us online.
SEN FERNANDEZ - SB 36 - Alright. Well, thank you, madam chair and mister chairman and honorable committee members for taking me out of turn. I'll be brief. I'm here to testify in support of our legislation, now S 36, An Act to provide accountability in the use of biometric recognition technology, and comprehensive enforcement. You know, this is technology that has vast impacts on our society, but there are very few laws guiding it. And I think that's probably why there's such a popular hearing today. There are very mundane uses for this technology, such as scrolling through photos or opening your phone. But companies are also using facial recognition and other biometric data to track our movements and effectively end privacy as we know it.
Massachusetts as a state has begun to consider regulating government use of this technology. And while that's a good start, what actually scares me more is the unfettered ability of private companies to use this data, our data that's most personal to us. And as facial recognition technology has improved, coupled with even greater saturation of cameras, private companies love the ability to track our movements without our permission or knowledge and sell our facial data to other companies. And there have already been well-publicized instances of fans being denied entry to concert venues based on the whims of a sports stadium owner after being identified by facial recognition scans. We are in a world where you can walk into a store, and the store cameras can match your face with an online profile, with data about your buying power and likely preferences, or where your face is tracked across public places, and that data is sold to market to you.
And maybe that doesn't sound like a problem, but what if a store or a company used facial recognition data to match your face to an online profile that projects your likelihood of theft, or uses tracking data to deny you insurance or a loan or some other service? So, we should all have a right to privacy. Companies that have compiled these databases with billions of pictures of our faces should have some regulations. And so what we're proposing is something that's, you know, this is a hard thing to regulate, I think. I know a lot of states are grappling with this. You know, the federal government's kind of, well, maybe someday we'll hopefully grapple with this. We're just proposing that we implement a fiduciary responsibility on these companies. We've seen that fiduciary responsibility actually works quite well in the financial space when it comes to protecting people's best interests. And so we think that model can be used here in the data privacy space in a way that puts the consumer, the person whose most personal data it is, in proper care. And so that's what this legislation simply does. It imposes this fiduciary responsibility model that works really well in this other sector, with some flaws, but decently well, on this space that sorely needs some regulation. So with that, thank you so much for taking me out of turn, and I'm happy to answer any questions.
FARLEY-BOUVIER - Sure. Thank you so much, Senator. Anybody here? Thank you, Senator. We appreciate your testimony today. We're now gonna go virtually to Megan Evans, and on deck will be Ben Winters. Megan?
MEGAN EVANS - ACOG - HB 86 - SB 197 - ( R ) Hi. Thank you. Can you hear me okay?
FARLEY-BOUVIER - We can. Thank you so much.
EVANS - Great. Chair Farley Bouvier and Chair Moore, as well as members of the committee, thank you for the opportunity to present testimony in support of House Bill 86 and Senate Bill 197, the Location Shield Act. I'm an OB GYN practicing in Boston, and I'm here speaking on behalf of the American College of Obstetricians and Gynecologists, for which I am currently the Massachusetts vice chair. The Massachusetts section of ACOG represents more than 1,400 OB GYNs who are committed to providing safe, evidence-based health care for the women and patients in our state and the patients who travel for care within our borders. I wanna first thank our legislative colleagues for their commitment to protecting reproductive health care within the Commonwealth. We have worked as partners over the years on a number of legislative priorities, including the passage of the Roe Act and, recently, a comprehensive maternal health bill.
Massachusetts continues to serve as an example across the country of what access to comprehensive, life-saving, and equitable reproductive health care looks like. However, recently, we have seen the anti-choice movement change its tactics to attempt to curtail access in states like Massachusetts. Recent reporting demonstrates that the threat of location data brokers selling information about abortion patients and providers is already happening. The widespread availability of granular cell phone location information enables anyone with a credit card to purchase and track the location data of people seeking or providing reproductive health care in our state. As long as the cell phone location market remains entirely unregulated in Massachusetts, we as providers and the patients that seek care continue to face the risk of personal harm. This is especially true now: protecting digital privacy only by drawing geofences around sensitive locations like clinics and hospitals can not only have the unintended consequence of actually undermining the protections the legislature has passed for those in our profession but also fails to secure location privacy for each and every person seeking critical care.
This includes victims of domestic violence seeking aid and shelter and patients seeking gender affirming care and abortion services, especially those traveling from hostile states. Passing the Location Shield Act is critical to mitigate the potential for hostile actors, whether they be anti-abortion extremists, abusive ex-partners or spouses, or anyone else, to weaponize our personal information to further their dangerous agenda. As reproductive health care providers, we are concerned not just about the privacy of patients and their families but also our safety and the safety of our colleagues. We feel strongly that every citizen of the commonwealth and any person who may travel, work, worship, or seek health care or shelter here deserves location privacy. This is precisely why Massachusetts ACOG has endorsed the Location Shield Act again this session. Thank you for your time.
FARLEY-BOUVIER - Thank you so much, doctor. Does anybody have any questions? Thank you. We're now gonna go to a panel with Ben Winters, Matt Schwartz, and Nicole Gill. Very welcome here. And on deck are Professor Woodrow Hartzog and Lisa Levasseur. Thank you so much. Thanks.
BEN WINTERS - CFA - HB 78 - HB 86 - Chair Moore, Chair Farley Bouvier, and members of the committee, thanks for the opportunity to testify in strong support of HB 86, the Location Shield Act, HB 78, the Massachusetts Consumer Data Privacy Act, and HB 99, which would address surveillance pricing in grocery stores using biometric information that is immutable. My name is Ben Winters. I'm the director of AI and privacy at the Consumer Federation of America, or CFA. CFA was established in 1968 and has over 200 members that are state and local agencies and other nonprofits that work in the public interest through education, advocacy, and research. So right now, what we're seeing is that consumers are facing a mosaic of unfair and deceptive trade practices that pervade life both online and off. Consent is rarely truly informed when you are breezing through these agreements or just getting to the next thing, and you need to go on Google Maps.
You're not gonna be able to do a research project to figure out whether you consent, and then you don't have any choice when you go there. Device and location information is collected and sold without notice, consent, compensation, accountability, anything. And biometric information, like your eyes, face, and thumbprint, that you have no control over or ability to mute, is collected and used in opaque ways that could affect the price you pay for groceries or whether you are targeted for ads or any fill in the blank. So, Massachusetts residents deserve better, and I think y'all know that. I think that is evident by the fact that we have, whatever, 20 bills on today that are all on this topic. But in particular, I'm urging you all to prioritize HB 78, which is a strong comprehensive consumer privacy bill that would create clarity, not just for the companies and the entities doing the collection of data but for consumers. Right? So you don't have to try to figure out the maze of it. Even as a consumer protection attorney, I could not even hope to understand every way my data is being used. And again, even if I did, I would have no control over it. So it's good to get clear baselines for both the businesses, small and big, as well as consumers, everyday people.
But particularly, this bill does 3 great things. 1, it gives consumers a specific common-sense right to know who has their data, to ask them to delete data that's held about them, and to opt out of certain targeted advertising as well as profiling that influences important credit or employment or housing decisions. 2, it just straight up restricts entities from collecting data that they don't need to provide the service that you're requesting. And 3, it gives a private right of action. It lets people that are hurt sue to make themselves whole. As we have talked about, the attorney general, while awesome and great, would never be able to get every single case. And, also, the incentives are off. If you are harmed by somebody, you should be able to sue and get your day in court and get the sort of resolution you need. If a comprehensive law is not something that can be passed right now, I urge y'all to do the sensitive location data. 92% of Massachusetts residents support a ban on the sale of location data, and it's clear why. Whether it's a bad actor like an abuser or someone trying to get you to buy something, it's invasive, unaccountable, and unacceptable. So I'll follow up more in writing on why I support those as well as the Surveillance Pricing and Groceries Act. But I see my time's up, and I'm happy to answer any questions later. Thank you so much.
MATT SCHWARTZ - CONSUMER REPORTS - HB 78 - SB 45 - HB 104 - SB 29 - Chair Moore, Chair Farley Bouvier, and members of the committee, thank you for allowing me to testify today. Thank you for all of your work to advance strong privacy protections for Massachusetts residents. My name is Matt Schwartz. I'm a policy analyst with Consumer Reports. Like Ben, I'm here in support of a few bills: H 78, the Massachusetts Consumer Data Privacy Act, as well as S 45, H 104, S 29, and the Location Shield Act. These efforts together indicate that the committee is already well aware of the importance of strong data privacy rules. As you've heard, companies are increasingly surveilling our every move, both off and online. This has led to an erosion of the basic expectation of privacy, as well as concrete harms that can negatively impact consumers in a variety of ways, including financially. That's why we're glad to see that Massachusetts is taking steps toward giving consumers more control over how their information is collected and used. We support H 78 in particular for 3 main reasons.
First, the bill has strong data minimization provisions. As you've heard, data minimization is the idea that companies should really only be collecting and using data that's necessary to provide the service that consumers ask for. We need this protection because companies are routinely collecting and using data in ways that are adversarial to consumers. Neither an opt-in nor an opt-out regime on its own can solve that. You know, for example, my weather app needs my location for the forecast but obviously shouldn't use it for unrelated purposes. We've heard industry argue that, you know, companies wouldn't necessarily know how to comply with such a framework and that the legislature should instead adopt the Connecticut standard to give them clarity.
But that standard doesn't actually create any new requirements for companies and simply states that businesses can't lie about what's in their privacy policy. That's already the law. As we know, those privacy policies are designed to be as vague as possible. So, we need privacy laws to do more. Second, the bill creates heightened protections for sensitive data, including an outright ban on the sale of data like our cell phone location data, religious and political beliefs, health information, and information collected about minors. This information can be easily purchased and weaponized by a number of folks: stalkers, scammers, and malicious political actors. We note that the Location Shield Act would provide similar protections for location data in particular. We urge the committee to advance that measure as well. And then, finally, we note that the bill would be enforced in part by a private right of action, which we believe is appropriate. Under the 13 state privacy laws that are currently active, there have only been a handful of public enforcement actions. At the same time, you don't have to look very hard to find obvious violations of those laws. CR recently released research finding that AG enforcement has not markedly improved privacy outcomes for consumers, so we need a more balanced approach. So I'll wrap it up there. Thanks again for your time. Happy to answer any questions.
FARLEY-BOUVIER - Thank you. And I understand 1 of your panel mates is joining us online. Nicole, are you online?
SPEAKER28 - Yes. Trying to get my camera on.
SPEAKER7 - Okay.
SPEAKER28 - Great.
SPEAKER1 - Okay.
NICOLE GILL - ACCOUNTABLE TECH - HB 78 - HB 86 - ( R ) - There we go. It worked. Okay. Woo-hoo. Yeah. All right. Chairs Farley-Bouvier and Moore, and members of the committee, thank you for inviting me to testify. My name is Nicole Gill. I'm a resident of Waltham, Massachusetts, and also the cofounder and executive director of Accountable Tech, a nonpartisan nonprofit that advocates for safer online platforms. I am here, on behalf of my organization, to ask the committee to favorably report out 2 bills that would provide comprehensive privacy protections for Massachusetts residents: the Mass Consumer Data Privacy Act, H 78, and H 86, the Location Shield Act. Now I'm calling an audible. I was supposed to talk about geolocation. Instead, I actually wanna tell you about the app that my children's daycare and school uses. The app was introduced about a year ago. Many of the parents are thrilled. Right? Because we all use apps for everything. And from the school's perspective, I understand why it's nice to kind of centralize all of your communications, the sharing of information.
However, the app is used to share health information, including, you know, those forms you get from your pediatrician outlining everything that has to do with your child's health concerns. They share pictures and videos in the app. And I've read the terms and conditions. I'm probably the only person who read the terms and conditions, and there's a million loopholes. Right? The company advertises itself as not selling its data. However, they are free to give it to a marketing partner. They are explicitly using it for what they call personalized advertising. So, surveillance advertising. Consumer, no, not Consumer, sorry, I'm trying to remember this. Common Sense Media and EFF, the Electronic Frontier Foundation, have both reviewed the app and noted that it has significant privacy concerns. Also, data from parents and educators working with young children is incredibly valuable. A researcher a decade ago found that identifying a single pregnant woman online is worth as much as knowing the age, sex, and location of up to 200 people. I can only imagine that that has increased in time. So, yeah. I'll stop there. Thank you.
FARLEY-BOUVIER - Thank you very much, Nicole. And I just think that that was a personal story that everybody can relate to. Right? We talked about the daycare app. I wanna ask a question. This is a pretty yes or no, maybe call it a lightning round, question for the 3 of you. Are you Republican, Democrat? Like, where do you sit when it comes to partisanship? Each 1 of you.
WINTERS - We're a nonpartisan organization. We're a nonprofit.
SCHWARTZ - Likewise. Nonpartisan. Nonpartisan. Nicole?
GILL - Yeah. We're also nonpartisan.
FARLEY-BOUVIER - Okay. So this really doesn't have anything to do with left or right when it comes to this.
GILL- No. It has to do with privacy and safety.
FARLEY-BOUVIER - Well, I appreciate the perspectives that each 1 of you bring to this, and the fact that you are putting the consumer at the center of it is really helpful. So, as announced before, we are gonna go to our next panel. This panel is online. I'm sorry. 1 question. Oh, I'm so sorry. Sorry. Yeah. Please.
OWENS - I actually have a question for Nicole. I wanna talk about your app for a second. Not your app, I'm sorry. You said parents are downloading or uploading health information, stuff from the pediatrician, immunizations? Yeah. Would you say an average person would assume that that is covered by HIPAA? I don't mean to put you on the spot.
GILL - It might be, but HIPAA often refers to the use of the sharing of medical data for medical purposes. You know, it's not clear to me that it is covered under HIPAA.
OWENS - My understanding is that it is not. Although, when I talk to people about it, yeah, medical data, they assume that it is. And so, just to point out, that is a red flag for me on that.
GILL - HIPAA, unfortunately, has a lot of loopholes to it. Yeah. And app developers are smart. They figured them out, and they're utilizing them. But the school needs to know about my daughter's nut allergy. The Internet broadly does not.
OWENS - Thank you. Fair.
FARLEY-BOUVIER - Thank you. Thank you all so very much. We're gonna go online to the next panel of Professor Woody Hartzog, Lisa Levasseur, and Eric Null. And on deck is our colleague, Representative Simon Cataldo. And I just wanna make sure we welcome Representative John Gaski, who's joining us for the committee. Thanks for joining us here today. Professor?
SPEAKER29 - Thank you so much. Can you hear me, okay?
SPEAKER1 - We can. Perfect.
WOODROW HARTZOG - BOSTON UNIVERSITY - HB 104 - HB 78 - ( R ) Fantastic. Thank you. Dear committee members, thank you for allowing me to provide remote testimony today. I appreciate it. My name is Woodrow Hartzog, and I'm a Newton resident and a professor at Boston University School of Law. I have spent the last 15 years researching and teaching data privacy law, and building on that today, I recommend 3 things. First, I recommend that this legislature favorably report on H 78, the Massachusetts Consumer Data Privacy Act, and H 104, the Massachusetts Data Privacy Act. These bills contain the 3 most important features of any effective privacy bill: a substantive data minimization provision, bright-line prohibitions on the most dangerous data processing, and meaningful enforcement. So with these statutes, Massachusetts has a chance to pass the first truly effective privacy law in the country. Big companies collect everything they can about us for profit. That's why strong data minimization rules are so important.
They limit data collection to what's necessary, not whatever self-serving purpose a company can get away with by burying it in the fine print. For privacy regulation to be meaningful, it must also be consistently enforced. State attorneys general do amazing work, but they can only do so much. The only thing that can meaningfully hold Big Tech accountable is a private cause of action combined with AG enforcement. Anything less is a gift to large tech companies. Representatives for big tech companies have told you today that this bill is too burdensome and will be bad for the industry. They've been singing this, I think, misleading tune for years, and their lack of accountability is what got us into this mess in the first place. Responsible businesses do not need to exploit the people of Massachusetts with impunity in order to thrive. My second recommendation is to favorably report on H 86, which protects location privacy. Every day, companies sell our precise location data to random strangers on the Internet.
This is 1 of the most dangerous kinds of data that you could possibly sell. It exposes our movements, our beliefs, our health status, and so much more. In our modern dystopian world of vanishing reproductive rights and vanishing people, it is vital that the location data of the people of Massachusetts be protected. Finally, I want to encourage lawmakers to favorably report on H 99. Grocery stores have started experimenting with surveillance pricing, which is the practice of using people's data trails to give them different prices than everybody else. Companies do this for 1 reason: to tighten the screws on customers and figure out the highest price that we are willing to pay when we walk into a store. This disastrous practice leads to price gouging, discrimination by proxy, inability to budget, data hoarding, and the suffocating feeling that every choice you make while shopping is a trap. This bill would prohibit food stores from using tools like facial recognition to charge people different prices and provide vital breathing room for customers while still allowing people to receive discounts. In conclusion, Massachusetts has in front of it today the perfect opportunity to lead the country once again in protecting people's privacy while fostering an innovative and flourishing marketplace. Thank you for the opportunity to testify today, and I look forward to your questions.
FARLEY-BOUVIER - Thank you, professor. Would you just mind taking a moment to talk about where you teach and what you do in this space?
HARTZOG - Sure. So, I teach at Boston University School of Law. I teach courses on information privacy law, advanced privacy, and I hope some of my advanced privacy students are watching today, and digital civil liberties. And I've been writing about data privacy rules now for quite some time.
FARLEY-BOUVIER - Thank you, professor. And now we have other people on the same panel that are also online, so we're gonna turn now to Lisa. Are you there, Lisa?
LISA LEVASSEUR - INTERNET SAFETY LABS - HB 78 - HB 104 - ( R ) Yep. Sorry. Looking for the unmute. Hi. Thank you so much, Chair Moore and Chair Farley Bouvier, and members of the committee. My name is Lisa LeVasseur, and I'm the research director and founder of Internet Safety Labs. I call it ISL. I'm here today to support H 78 and H 104. ISL is a nonprofit, nonpartisan digital product safety testing organization that studies safety and privacy risks in websites and mobile apps. We've done extensive empirical research in quantifying and identifying risky third-party data sharing in apps. In 2022, we studied more than 1,700 apps recommended or required by K-12 schools across the U.S. We've subsequently developed and published robust safety labels that show all of the third-party companies observed to be receiving data from those apps, including identifying the registered data brokers.
Last year, we published research describing the global commercial surveillance infrastructure, identifying entities, customer data platforms, and identity resolution platforms that allow marketing and advertising companies to aggregate millions of personal data elements from disparate sources, through bulk customer database transfers as well as transactionally through digital advertising, with the inclusion of proprietary personal identifiers conveyed in the real-time bidding protocol. That's a mouthful. We're techy. We're super techy. We dig under the hood. Both methods may qualify as the selling of data. Of the combined 360 customer data platforms and identity resolution platforms we looked at, only 16.4% of these platforms were registered data brokers. Many more of them should be. Note also that these platforms indiscriminately hoover up and monetize personal data without regard to age. 539, or 35%, of the K-12 apps in our 2022 research have sent data to these commercial surveillance platforms. We are delighted that the MCDPA and MDPA provide 2 crucial measures to make digital products safer. First, meaningful data minimization. In 2022, we published 10 principles for safe software.
Principle number 4 is data collection minimization, where data collection must be proportional to the deal being established between the product and the consumer. This kind of contextual proportionality is vital for meaningful data minimization. We're happy to see the alignment in H 78 and H 104. It's 1 of the easiest and best ways to keep people safe when using digital products, by not collecting the information in the first place. Secondly, a prohibition on the sale of sensitive personal data is a much-needed and profoundly important measure, and we're happy to see it in H 78. The selling of sensitive personal data has been shown repeatedly to present physical, emotional, and reputational risks to individuals and groups. We believe personal data markets are profoundly risky to individuals and society, especially children. The proposed H 78 definition of sensitive data, plus the prohibition of the sale, is an excellent start to keeping everyone safer. ISL and our empirical research are at your disposal. Thank you for the opportunity.
FARLEY-BOUVIER - Thank you so much, Lisa. We'll turn to Eric, and then people can have questions for the panel. Eric?
ERIC NULL - CENTER FOR DEMOCRACY AND TECHNOLOGY - HB 104 - ( R ) Hello. Thank you very much, co-chairs Moore and Farley Bouvier and members of the committee. I appreciate you inviting me to speak to you today on 2 important privacy bills, the Massachusetts Consumer Data Privacy Act, H 78, and the Massachusetts Data Privacy Act, H 104. And thank you for your clear hard work on this important issue. My name is Eric Null. I'm the co-director of the privacy and data program at the Center for Democracy and Technology, a 30-year-old nonpartisan nonprofit organization focusing on protecting individual rights, civil rights, and civil liberties in the digital age. While there are many great aspects of these bills, I'll focus here on 3 issues: 1, moving us beyond the notice and consent regime to a data minimization regime; 2, protecting civil rights; and 3, ensuring strong enforcement. First, 1 of the primary goals of privacy legislation should be to move us beyond the failed notice and consent regime,
That regime has been dominant since the 1990s and ultimately places the privacy burden on already overburdened individuals. However, we know people don't view privacy policies as effective or useful. We know people don't read privacy policies, and we know that if people did read them, it would require hundreds of hours per year. As a result, people have a sense of futility and feel a lack of control over privacy risks, and they often underestimate the risks of disclosing their data. Both bills, which CDT supports, would shift the primary burden to the companies who benefit most from the collection and exploitation of that data. The bills accomplish that through strong data minimization provisions that, unlike many other states' laws, require companies to justify their data practices in the first instance. Data minimization helps prevent privacy harms at the outset because data a company does not have cannot lead to downstream harms through misuse, unauthorized access or disclosure, or some other harmful action. Data minimization is also bipartisan. A recent Consumer Reports survey found that 72% of Republicans and 79% of Democrats support a law that limits companies to using only the data they need to provide their service.
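As a rough sketch of the data minimization idea, the snippet below, in Python and with a hypothetical purpose-to-field mapping invented for illustration, returns only the fields a declared purpose actually needs; anything extra is simply never collected, so it can never be misused, breached, or sold.

    # Minimal sketch: data a company never collects cannot cause downstream harm.
    NECESSARY_FIELDS = {
        "ship_order": {"name", "street_address", "city", "zip"},
        "email_receipt": {"email"},
    }

    def collect(purpose, requested_fields):
        """Keep only the fields necessary for the declared purpose."""
        allowed = NECESSARY_FIELDS.get(purpose, set())
        return set(requested_fields) & allowed

    # A request padded with extra data comes back trimmed:
    print(collect("ship_order", {"name", "zip", "precise_geolocation", "contacts"}))
    # {'name', 'zip'}  (set order may vary)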
Second, these bills provide increased civil rights protections. Privacy rights are civil rights. A privacy law should protect civil rights because we have already seen data being used in a discriminatory way, particularly through the training of and decisions made by algorithms. For instance, credit scores and the factors used to calculate them are deeply correlated with race. According to the Brookings Institution, Black and Hispanic individuals are much more likely to have credit scores below 620 than white individuals. And facial recognition software exhibits similar biases, leading to the misidentification and wrongful arrests of at least 3 Black men: Robert Williams, Nijeer Parks, and Michael Oliver. Last, privacy laws are only as strong as their enforcement. They should be enforced through multiple channels. Both of these bills provide the Massachusetts attorney general with rule-making authority and civil penalty authority and provide individuals with a private right of action. That way, both the state and individuals can ensure that privacy is protected. To ensure proper enforcement, the AG should be appropriated enough funds to build a dedicated office and team like in Texas. Thank you for your time. I look forward to answering any questions you have.
FARLEY-BOUVIER - Thank you so much, Eric. Are there any questions for this panel? So that would be the professor, Lisa, or Eric. Rep. Vitolo.
VITOLO - This is a question for anyone on the panel. You didn't testify specifically about this, but I trust that your expertise may allow me some wisdom. I have a 10-year-old daughter and a 14-year-old son. Their experiences on the Internet are very different from each other. They're both minors, but they're very different. Can you give me some guidance about how we ought to think about different age groups of young people and their data privacy needs above and beyond minimization? Or at least can you point me offline to some resources to better understand how to think about different ages of young people, different cohorts, and how to protect them, especially because they age out of the cohort they're in, and that makes it even more complicated.
FARLEY-BOUVIER - Thank you. Since you're not there to nudge each other with your elbows on the table in front of us, I'll just invite anybody to pipe in.
HARTZOG - I'm happy to talk about it for a little while. Thank you for that question. So it's a really important one. And I've long said that, while it's important to be able to have protections for kids, it's important not to sort of understate the overall danger. And so there's a lot of protections that I would advocate for that would apply to all ranges of children, particularly those involving data collection or actually the design of certain information technology. So anti-engagement, anti-dark-patterns strategies. People should be protected at every age and no matter what choices they make online. And so I think that by a heavy focus on privacy, you're actually able to be more responsive to a broad set of needs that are less sensitive to differences in ages, at least with respect to the bills that we're considering here today. I hope that helps. I don't know exactly how responsive that was, but I hope it was at least somewhat responsive.
LEVASSEUR - Yeah. I wanna second that. You know, we say on the internet, everyone's a child. We believe that everybody deserves strong privacy protections. Yeah. So.
FARLEY-BOUVIER - Thank you. Anybody else from the committee? So, Lisa, I just have a quick question for you. What incredible work you do to be tracking all these apps. I mean, just counting all the apps would be a hard thing to do. But is it there on your website? Is there a resource for people to go, in a consumer-friendly way, to learn about these apps, to say what's happening with this app?
LEVASSEUR - Yeah. Appmicroscope.org is the web service that we have that displays the safety labels. The safety labels that you see right now are heavy on privacy risk. We're adding in dark pattern risk. We're adding in other kinds of risks. But right now, they're heavy on privacy risk, and you can see them, and we're adding new ones every day.
FARLEY-BOUVIER - Would you just say, talk about what dark patterns are?
LEVASSEUR - Yeah. So, dark patterns are techniques that software developers deploy in designing the user interface that consumers interact with. These are patterns in the sense that they are repeatable techniques. So here's one that you've all seen: the consent to cookies. The button is big and green on the bottom of your screen, and if you wanna change your settings, it's maybe a shaded gray, hard-to-find link somewhere else. Right? That's a dark pattern. It's funneling you down a particular path to take a particular action. In our software safety standards panel, we have been aggregating work from academia and from laws that enumerate and define dark patterns, and we've come up with a list of nearly 70 atomic-level dark patterns that have been defined that are forms of manipulation. Really, that's the way to think about it. They are forms of manipulation that have been codified so that they're repeatable. And you've seen these: infinite scroll, for example, is a highly addictive pattern, and people know this, developers know this, they've known this for quite some time. Same with the like button. It is an addictive dark pattern. So, we are cataloguing and categorizing those, and we will make those available in the safety labels as well.
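As a sketch of how a catalogue of atomic-level dark patterns might be structured, here is a small, hypothetical Python entry for the cookie-banner example above; the field names and the category label are invented for illustration and are not ISL's actual taxonomy.

    from dataclasses import dataclass

    @dataclass
    class DarkPattern:
        name: str
        category: str        # e.g., "interface interference", "forced action"
        description: str
        example: str

    asymmetric_consent = DarkPattern(
        name="Asymmetric consent choices",
        category="Interface interference",
        description=("The privacy-protective option is visually de-emphasized "
                     "relative to the data-sharing option."),
        example=("A large green 'Accept all cookies' button next to a small, "
                 "gray 'Manage settings' link."),
    )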
FARLEY-BOUVIER - Thank you very much. Really appreciate the testimony of this panel, and I hope that you will continue to engage with the committee. I want to bring up Rep Cataldo. And I understand you have some people with you. Rep, if you want to introduce them. And as you're coming up front, we want to welcome Representative Lombardo. Thanks for being here. Nice to see you, Representative.
REP CATALDO - HB 103 - Madam chair? Yes. Chair Moore, thank you so much for the opportunity to testify, and thanks for taking us out of turn. Online should be Sean Damianos. Okay. And here is Sean Pauzauskie with me. He's coming all the way from Colorado today. This committee has heard some terrifying testimony, and, sorry to say, it's about to get worse when you hear what my colleagues have to say. The bill was summarized briefly earlier by the chair and Representative Vargas, so I'm not gonna go into that again, and I'm gonna turn it over to them in a minute. I just, you know, the only point that I would make before they begin is that the hands in which this technology exists now are not necessarily the folks that I would trust with data about my brain. So this is H 103. It's called an act to establish the Massachusetts Neural Data Privacy Protection Act, and we drafted it in such a way that I think it could be neatly tucked into any of the more comprehensive bills on data privacy that have been proposed. And happy to work, of course, with the committee on that. But I will turn it over to Sean.
SEAN PAUZAUSKIE - THE NEURORIGHTS FOUNDATION - Alright. Thank you so much, Representative Cataldo. Thank you, Madam Chairwoman Farley-Bouvier and Senator Moore. My name is Sean Pauzauskie, and I'm here today on behalf of the nonprofit NeuroRights Foundation. I serve as medical director. I'm a practicing neurologist and clinical researcher in Colorado, and I also conduct clinical research with companies that are employing neurotechnology in the consumer space, including some in Massachusetts. So, thank you for the opportunity to testify. While we neurologists tend to be a little cerebral, not to make a pun, I wanna speak to you today.
FARLEY-BOUVIER - Thank you for that, man. It's all men.
PAUZAUSKIE - Yeah. But, today, I wanna speak to you from the heart, just to talk about this issue. So my interest in this issue comes from being approached by my research office about doing a research project using a consumer neurotechnology device that anyone can buy off the Internet today and have shipped to you, to do clinical research, because the data collected from these devices is medical grade. And that just struck me as somewhat concerning. And so that's kind of what started me on this path toward, you know, this issue that we broadly define as neuro rights, including the right to mental privacy. There are approximately 30 of these companies across the country collecting this neural data, including in Massachusetts, and doing so with things as simple as a headband or earbuds.
And as I mentioned, this data is medical grade. We enjoy the protections of HIPAA in the clinic and in the research setting, but that's not the case for citizens more broadly, including up to 3,000,000 in Massachusetts who suffer from anxiety and depression, 450,000 with schizophrenia, hundreds of thousands with cognitive disorders, and 70,000 with epilepsy, all at risk of undue identification and possibly having that data sold to third parties for purposes of manipulation or possibly discrimination. Some of the most sensitive data that you could possibly share with anyone is data coming from your brain. My concerns go beyond this into the realm of mental privacy more generally. We know that in the laboratory, you can decode thought to text, using the same kind of data, with up to 40% accuracy today.
If things continue to advance, and with the big tech companies getting ready to introduce products that can collect, you know, brainwaves from simple ear pods or from your wrist, hundreds of millions of people could be at risk of having this data taken and even their thoughts decoded in several years if things progress at the rate that we anticipate. So I'm not just here worried about my patients, but about things more broadly: that if this data were aggregated, it could be used to decode the thoughts of people across the country in the coming years. And so I think that H 103 provides us with a simple, common-sense solution to this, providing the robust protections that this sensitive data deserves, and I urge your support. I would be happy to answer any questions.
FARLEY-BOUVIER - Thank you so much. So we're just checking in to see if your other panelist is online. Sean, are you online with us? Didn't see him listed.
SPEAKER34 - Oh,
SPEAKER1 - here we go. Hi.
STEPHEN DAMIANOS - THE NEURORIGHTS FOUNDATION - ( R ) - Yes. Yes. This is Stephen Damianos. Hi. Yes. Chair Farley-Bouvier, Chair Moore, members of the committee, and Rep. Cataldo, thank you very much for your consideration of this important bill and your leadership in this space. My name is Stephen Damianos. I am the Executive Director of the NeuroRights Foundation, which is a nonpartisan 501(c)(3) nonprofit dedicated to promoting the ethical innovation of neurotechnologies. To be clear, when we're talking about neurotechnologies, we're talking about any device that is capable of either monitoring or manipulating the activity of the brain and the wider nervous system. At the NeuroRights Foundation, we believe in the exciting potential of emerging neurotechnologies, and we want to make sure that these technologies are designed and deployed responsibly. Our goal is to leverage these technologies for social good while safeguarding them from misuse. My colleagues and I are strongly in support of this bill. Neurotechnologies have existed in medical settings for decades but are quickly proliferating into the consumer space. When used in medical settings, they are heavily regulated by health privacy laws such as HIPAA and others.
But once taken out of medical settings, neurotechnology is considered consumer-facing and not subject to any regulation at all. The reason this gap is so concerning is the extreme sensitivity of neural data. When paired with generative artificial intelligence algorithms, neural data are capable of revealing deeply intimate information about consumers, including information about their health, mental states, emotions, and cognitive processing. And as my colleague Dr. Pauzauskie just said, there is increasing ability, in addition to accessing diagnoses of neuropsychiatric disorders, to decode human thought from these devices. At the NeuroRights Foundation, my colleagues and I conducted extensive research studying the privacy practices and data rights of consumer neurotechnology companies, and we were deeply concerned to find widespread deficiencies and enormous gaps between data protection standards globally and actual industry practice. This isn't a problem just for the future; I want to emphasize that this is a problem today.
Consumer neurotech is not exactly mainstream yet, but it isn't fringe, either. For example, one leading neurotechnology company claims to have already collected over 100,000,000 minutes of consumer neural data, which is approximately 1,700,000 hours, while another leading neurotechnology company shared with us that they have 1,000,000,000 minutes, which is roughly 17,000,000 hours. And, of course, you could train an algorithm on just a fraction of this. And I'll note that tech giants like Meta and Apple are in the process of developing and commercializing their own products as well. And so there's an enormous need to safeguard this space, and this bill would have important implications for Massachusetts consumers and companies that want to sell their products in the Massachusetts market. I'm happy to answer any more questions, and I'll just note quickly that Colorado and California recently passed similar amendments. And just last week, Montana passed legislation, passing the senate on a vote of 49 to 1 and the house on a vote of 100 to 0. And so there's growing bipartisan recognition of the importance of this type of effort. Thank you very much.
FARLEY-BOUVIER - Thank you. Thank you to all the members of the panel. Are there questions from the members of the committee? Senator.
MOORE - So what is this? So help me. What would this be used to detect?
PAUZAUSKIE - Yeah. So today, it could be used, for example, I'll just take probably the most, you know, defined example of epilepsy. So epilepsy is not just as simple as you're having a seizure or you're not; there are brain patterns, easily detectable from things like consumer brainwave data. So if I had epilepsy and I'm using one of these products, my data gets sent to the company. They could easily decode that I had epilepsy and perhaps use that to sell me whatever I might need for that, or use that to, you know, sell to insurance companies, things of that nature. That's just one example.
MOORE - Alright. But this device that's being used is being prescribed by a physician.
PAUZAUSKIE - So these are consumer devices. That's a great distinction. These are the same types of data that I collect in the clinic and the hospital every day but that are available through Amazon or the company websites.
MOORE - So I guess if I wasn't going to my primary care or a specialist, why would I purchase one of these devices?
PAUZAUSKIE - Yeah. Exactly. So a lot of them are being marketed as wellness devices for things like meditation, for sleep, to enhance your mood, to help your cognition, and more and more applications are growing. The reason that there's this loophole is because of the wellness category of device. So you can get a brain wellness device now that uses the same types of data that we use in the clinic.
MOORE - Okay. Alright. Thank you.
FARLEY-BOUVIER - So just to clarify, nothing about this bill or the other data privacy bills being heard today is stopping this advanced technology. It is about protecting the data. That's what you're bringing to us today. We're not saying it shouldn't be consumer-facing. You're not asking for it not to be at the consumer level. You're asking that the data be protected.
PAUZAUSKIE - Exactly. Great distinction. Yes. In fact, our first stated mission is to promote innovation, and we feel that we can do that by both lending structure to an emerging industry and also really legitimizing this technology through things like legislation so that people are more aware of it.
SPEAKER16 - Thank you. Sure.
MOORE - So, why wouldn't this be classified under HIPAA?
PAUZAUSKIE - That's a great question. That's why we're here. It's because these devices fall outside it. It wasn't an intentional oversight, either. We're not saying that neural data was deliberately excluded when HIPAA was written or when any of these state privacy laws were written. It's just such a new and emerging technology that they just weren't aware of it at the time these laws were written.
MOORE - Okay. Thank you.
FARLEY-BOUVIER - And I'll just also say, around HIPAA, because I think that comes up a lot when we talk: thank you for saying wellness. There's a lot of health space where we think of health information, but it's really health insurance, right, that HIPAA covers. If it's covered by insurance, then that's what's really protected, not actually your health information. So thank you very much. Anything else? Okay. We're gonna now, sure.
SPEAKER7 - Sure. Of course. Thank you so much.
FARLEY-BOUVIER - We're gonna now bring up Chrissy Lynch from the AFL-CIO. And on deck will be, virtually, Robert O'Koniewski. I never have to say his last name. We just call him Bob O. Okay? Chrissy, thanks for joining us here today.
CHRISSY LYNCH - MASSACHUSETTS AFL-CIO - HB 78 - HB 86 - SB 197 - Thank you. Good afternoon. Chair Farley-Bouvier, Chair Moore, and members of the joint committee, thank you for the opportunity to testify before you today about an issue threatening workers across the country: data privacy in the context of rapidly evolving technology that knows exactly how to squeeze the most possible out of workers and consumers. My name is Chrissy Lynch. I'm honored to be the president of the Massachusetts AFL-CIO, representing over 800 local unions and nearly half a million members from all sectors of work across the state. I am here today to testify in strong support of H 86 and S 197, the Location Shield Act, and H 78, the Massachusetts Consumer Data Privacy Act. Massachusetts workers are being tracked. They are being profiled. And their most sensitive data about their health, their finances, and their physical location is being sold, stored, and analyzed without their meaningful consent. In countless workplaces, surveillance is no longer limited to security cameras. It's happening through wellness platforms that monitor health, tools that collect keystroke data, and shift scheduling tools that use algorithms to score employees. Workers are rarely informed about when and how this data is being used and are even less frequently given a choice about whether this data should be collected at all.
This bossware, as it's known, is becoming increasingly pervasive, and it has a chilling effect on worker organizing. Despite its benefits to our daily lives, personal cell phone location data can be used and sold to the detriment of many. Every day, unregulated data brokers buy and sell personal location data from apps on our cell phones, revealing where we live, work, play, worship, and seek health care. This is a major concern not only for our own personal privacy but for the privacy of our families when they access essential health care. These invasive practices also intimidate workers who are considering taking collective action or whistleblowing around safety concerns in their jobs. Imagine a corporation buying cell phone location data for their workers in order to determine exactly where and when those workers are meeting. Without strong controls on the sale of data, requirements to ensure that the only data businesses collect is what is necessary to do the job of their business, and strong enforcement provisions, including rule-making and enforcement authority for the AG and a private right of action for consumers, these practices will continue unchecked.
We've seen what it looks like to have big tech CEOs sit at the right hand of the president. Tens of millions of Americans have had their sensitive data accessed through the DOGE takeover. Bad actors have been on the prowl for sensitive data for years, and now we have a federal government willing to hand it over. It is imperative that Massachusetts acts on this. We are poised to enact comprehensive data privacy protections like those passed in Maryland. We need to strengthen our legal framework to protect workers on the job and as members of the communities where our workers live. Any consumer data privacy legislation that moves forward must include those 3 provisions in order to protect our communities from exploitation by shady corporations. Additionally, the Massachusetts AFL-CIO strongly opposes the industry-backed proposals, which we heard today, to so-called regulate this privacy data. The labor movement is all too familiar with what happens when corporations are left to regulate themselves. And as we've heard today, the big tech industry tries to make their legislation seem like a good thing. But in reality, it is window dressing that leaves consumers and working people vulnerable to exploitation. Thank you for your time. Thank you for your public service. Thank you for your dedication to ensuring that, as rapidly evolving technology keeps evolving, our state is keeping up. I'm happy to answer any questions.
FARLEY-BOUVIER - Thank you so much, Chrissy. I'm gonna first turn to the committee. Representative.
REP LOMBARDO - Thank you. You mentioned companies potentially wanting to help their employees. Were there any examples you can share where that may have happened?
LYNCH - Amazon. You know? And Amazon does hiring, firing, and discipline just based on an algorithm that has bias baked right into it. We see Amazon tracking workers when they go to the bathroom, when they drive their cars, and when they're in between shifts. So that's just 1 example.
LOMBARDO - And you mentioned bias built in. What does that mean?
LYNCH - The algorithms that are behind a lot of this data are put together through equations that can bake in bias. For example, we'll use Amazon again: some of the job applications that Amazon was receiving were automatically being screened, and women were being screened out. I would use that as an example of bias.
SPEAKER32 - Thank you.
FARLEY-BOUVIER - Anything else? Is anybody else over here on this side? Thank you so much.
SPEAKER5 - Thank you very much.
FARLEY-BOUVIER - Thank you for your service. Good to get the perspective of the worker. Okay. Bob O, are you online? And then we have Alexander Matthews from Digital Force, who will be next. Thanks so much, Bob.
ROBERT O'KONIEWSKI - MSADA - ( R ) - Anytime. Well, thank you. I appreciate the graciousness that you've displayed out there today. Madam Chair, Senator Chairman Moore, and through to the members: Bob O'Koniewski, the Polish guy, executive vice president and general counsel for the Massachusetts State Auto Dealers Association. We represent the 427 franchise car and truck dealers across the Commonwealth. We have over 25,000 employees at our dealerships. We're approximately 20% of the retail economy in Massachusetts. We sell right around 300,000 new cars per year and an equal number of used cars per year, give or take 20,000 a year. So our economic activity is very well founded. We also, as part of our dealerships' service and repair operations, diagnose and repair vehicles on behalf of consumers. So you can imagine that there's quite a bit of information that dealerships take in as it relates to selling or leasing a vehicle to a consumer, repairing a vehicle, and arranging financing for a consumer if they need it. And it's a very heavily regulated industry, whether we're talking federal law, state law, or regulations. We do a lot with the association for our dealers in terms of webinars, compliance materials, and meetings to try to make sure that they are in compliance with everything from advertising laws to data protection.
The commitment from the dealership industry is very well founded and well written about. There's a lot that we go through. Chairwoman, you had mentioned the Gramm-Leach-Bliley Act. Yes, it's over 25 years old, but it is also a very fluid law in the sense that it is subject to change. The dealers just went through a major re-up of compliance under Gramm-Leach-Bliley in terms of the rules that kicked in in June '23, which we talked about at the last hearing. So we don't come here as necessarily an opponent. We think that there are some things that could be done better in terms of our industry, to clarify some of the language. We've asked for that in the past because of the heavily regulated aspects of the business in all realms. And we've offered up amendments to that effect, as well as tweaks to some of the language on the data and information carve-out that the bills have.
So, you know, with these bills, we'd like to sit down and talk about where some of this is going in terms of how we can play a better role in helping you all understand where we're at, both within the federal scheme of things and the state scheme of things. And I would just finish up by saying, you know, we're all consumers as well. So, you know, none of us on the dealer side are interested in engaging in behavior that would put a consumer in a bad position in relation to data and information. So we go through a lot of activities and compliance efforts at the dealerships to protect that data and information moving forward. And I know my time is up, but I would just, you know, circle back on one other issue that came up a couple of years ago, which is the right-to-repair litigation. The federal district court came up with a decision in February of this year to dismiss the manufacturers' lawsuit on the right-to-repair amendments. The manufacturers, in March, filed a notice of appeal on that. So, although the right-to-repair folks prevailed in court, there's still gonna be an appeal process. And as you know, there's that type of information that's in the vehicle that's been the subject of some controversy as that plays its way through the court. I appreciate your time today. Thank you very much.
FARLEY-BOUVIER - Sure. Thank you so much, Bob. Anybody on the committee? Thanks, Bob. We're moving on now to Alex Ames. Oh, I'm so sorry. Who did I miss? Oh, Representative Gaskey. I'm sorry. I didn't see you down there. Are you still with us, Bob? There he is.
O'KONIEWSKI - Of course. Okay.
REP GASKEY - Yes, sir. What type of data anonymization do you guys do with regard to, customer data that you collect?
O'KONIEWSKI - Do you have something more specific in mind? Dealers are not in the business of selling data. The data or information we collect on a vehicle involves warranty issues, recall issues, service issues, and information that we collect on behalf of the Registry of Motor Vehicles on transactions. You know, so the dealers are not in the business, it's not part of their business model, of collecting and selling data. There are a number of forms that dealers also proffer to the consumer that explain what dealerships are going through in terms of compliance matters as it relates to information. And, again, I'm speaking of the franchise dealers. I don't represent the used car guys at all.
GASKEY - So, along those lines, when a vehicle comes in for service, how much data do you collect, and where is that data reported to? Like, do you report information directly to insurance companies or anything else like that?
O'KONIEWSKI - Well, in terms of a vehicle, let's say you go into a dealership to get service. They will collect and retain the information for that vehicle and for who the owner of that vehicle is. I don't know if you've gone through a dealership in terms of getting reminders as to service on that vehicle. If there are warranty issues or recall issues, there's a flow of information that goes from the manufacturers and the feds through to the consumer directly on a recall. If it's a recall, you know, obviously, the car owner can go anywhere that's a dealer for that line make to get that addressed. So, there is a lot of information that flows back and forth. You know, one of the concerns that we have brought up is that some of the bills have talked about the deletion of information. And what we've pointed out is that if a consumer deletes that type of information once the service is done at the dealership, it makes it very difficult then for a dealer in that line make to keep track of that vehicle for that consumer or that car owner, if you will. So, if you are, let's say, a Ford owner, you can go to any Ford dealership to have that warranty or recall addressed. If the information is deleted by a consumer car owner, it would make it difficult for the manufacturer and, hence, a dealer in the dealer network to address a problem that may come up with that vehicle. Does that answer what you're looking for?
GASKEY - A little bit.
O'KONIEWSKI - What am I leaving out? Because I don't want you feeling, you know, unanswered.
GASKEY - Yeah. Well, I'm not sure that you would be able to answer my question. I'm not sure that I have my question formed exactly the way it needs to be formed, but I'm okay with everything right now. Thanks.
SPEAKER1 - Thank you, representative.
O'KONIEWSKI - But I would just, representative, I would just offer up that if there are, you know, further issues you wanna do a deeper dive on, I'd be more than happy. We can meet at a dealership, and we can walk through those types of scenarios to get a better sense of just what's going on at the dealer level.
GASKEY - Okay. That sounds good, because the amount of information that is collected by vehicles these days is astonishing when you actually dig deep into it. And, at higher levels, it does get a lot of data transferred over to insurance companies, and that affects pricing in different areas of the country. So
O'KONIEWSKI - Well, you know, I'm glad you brought that up. I'm glad you brought that up because the chairwoman was very active on this matter a couple of years back. There is a lot of information that is collected in the vehicle that has nothing to do with the dealers. And, you know, one of the things that happens on the insurance side, if I recollect correctly from what I've been told, is that if there is an accident or some sort of report that a consumer or car owner needs their vehicle addressed, the insurance companies are out there downloading data out of the black box and other mechanisms in the vehicle. So they are getting extensive data on vehicles that is separate from what a dealer may need information for to address a consumer issue. And the same goes for the manufacturers. That's one of the disputes that the manufacturers are in with the aftermarket folks and the right-to-repair folks: just what information is in the vehicle that's personal, that is sensitive, that may not have anything to do with the diagnostic and repair activities at a dealership, which the independent repairers have access to under the right-to-repair laws that have been passed in the commonwealth.
FARLEY-BOUVIER - Okay. Thank you. And in the interest of time, I'm gonna let the representative and you take this offline. We still have several more people to get through today, several dozen.
O'KONIEWSKI - Thank you, madam chair. Appreciate it.
FARLEY-BOUVIER - Great. We're gonna now move to Alexander Matthews, and then on deck, Doctor Adenekan Dedeke. Thanks for being here.
MATTHEW ALEXANDER - ACLU - Chair Farley-Bouvier, Chair Moore, and members of the committee, thank you for the opportunity to talk to you today. I am the co-chair of the volunteer civil liberties organization Digital Force. I and my colleagues who are all testifying here today, Julie, Tasha, Alex, and Karen, are not here because we're paid by big law or by big privacy or by George Soros or anyone else. We are here because we're worried. We're not in control of our data, and we need your help. I'd like to speak briefly regarding the testimony of Mr. Kingman of the State Privacy and Security Coalition. There's an article just about him and about the activities of his coalition and how he goes from state to state, putting out the same talking points to a range of small businesses who then communicate with representatives and with governors and give the impression that there is broad-based opposition to strong privacy bills. He doesn't find grassroots opposition to privacy bills. He astroturfs it.
H 80, which he supports, takes an opt-out approach to privacy that is not consistent with the GDPR. Mister Kingman and TechNet and others take a race-to-the-bottom approach, where consumers will continue to be unable to control their data. A private right of action is not there for the fun of it. I'm not gonna make a dime from it. Only a meaningful threat of litigation holds corporations accountable for whatever other rules you wanna pass. The AG's office already doesn't intervene in half of the responsibilities it already has. Consider the recent abduction by ICE of Rumeysa Ozturk, the Tufts grad student who wrote a pro-divestment op-ed. To pick her up, they didn't go to her apartment or to a PhD seminar. No. They picked her up on the street and likely used her location data to do it. They probably bought it from a data broker, maybe even a data broker represented by one of the industry spokespeople who have testified today. In her name, in these discussions, you could ask whether any of the corporations represented by lobbyists here today sell location data to ICE and facilitate what the Trump administration is trying to do. It is dangerous to allow a free-for-all. We support strong and broad privacy protections, such as those in the bills the chairs have proposed, including a private right of action and appropriate thresholds to protect small businesses in our towns and cities. Thank you.
FARLEY-BOUVIER - Thank you. Appreciate you being here. Can we bring up Doctor Adenekan Dedeke, please? And then, on deck is Nancy Brumback.
ADENEKAN DEDEKE - NORTHEASTERN UNIVERSITY - SB 45 - Chair Moore, Chair Farley-Bouvier, and distinguished members of the Joint Committee on Advanced Information Technology, the Internet, and Cybersecurity, thank you for the opportunity to testify this afternoon on Senate Bill 45, an act to establish the Massachusetts Data Privacy Act. My name is Adenekan Dedeke. I have to pronounce it across cultures, so that's a good way to pronounce it. I'm a professor of management information systems at Northeastern University. I've been teaching at the college level for over 2 decades. I teach technology, and I'm interested in ethics and privacy on the other side. So I'm kind of a person in the middle: I want technology to work very well, but I'm also very, very aware of the risks that it poses. And it's very interesting to tease that out with my students and in my research. I have personal reasons for supporting the enactment of Bill 45. Personally, I've been affected by several of the data breaches that have been reported in the Massachusetts media, from the Target to the TJX data breaches. In the past 5 years, I have received data breach notification letters from third parties, most of which I have never had any business with. That gave me the somber awareness that my data is being shared and sold without my consent.
Also, during COVID-19, I had a rude awakening when I found out that someone had applied for a million-dollar COVID loan in my name. And I happened to be looking for a house during that time, when someone told me, oh, you have a business? I said no, I don't have a business. I'm a poor professor. And he told me the details of how someone had applied for a COVID loan. I was very fortunate to be able to stop it. So, I have very strong reasons to want a data privacy law. In my research life, what I do is review the strengths and weaknesses of national security standards. I've also done some work looking at how privacy laws are structured. I've looked at the GDPR. During my research, I came across Bill SB 45 last year. I reviewed it, and I felt that it was very well crafted, one privacy law that I could get behind. I was pleased to spend several hours reviewing the bill and sending my feedback and comments to Senator Moore's staff. To my surprise, they got back to me on it, and it has been very, very encouraging to send them what I feel and what I know, and for them to be very, very receptive to it. Reading the bill also gave me the sense that the people who drafted it actually communicated with the people they're trying to manage or control, which I appreciated.
There are a few elements that I think make the bill stand out in my view. I think that data minimization is very, very crucial, and I think there are some related issues that the senators and House members should also consider. But I think it's very, very important, and I think it's a strong suit of the law. I think the clear affirmative consent, and the ability to revoke consent, is very, very powerful and necessary for the law. Also, I think the requirement that the request of the data subject be communicated to everybody is very important, because they have to communicate it. I do not claim that the bill is perfect. Nothing is perfect. I think, moving into the future, we may need to adjust it a little bit here and there, but I think it's a very, very solid bill that is needed, and it'll be very, very helpful. And I think it's a good thing for the senators and the House to look into it. Again, thank you for the opportunity to testify this afternoon. I urge the committee to favorably report Bill 45, and I'll be happy to take any of your questions.
FARLEY-BOUVIER - Thank you so much. Appreciate your testimony today. I think we're gonna now move to Nancy Brumback of the League of Women Voters. I'm sorry. Oh, no. Okay. Okay. Then, on deck, we have a panel led by Carrie Richgels. So, Nancy Brumback.
NANCY BRUMBACK - LWVMA - HB 86 - SB 197 - ( R ) Good afternoon, Chair Farley-Bouvier, Chair Moore, and members of the committee. Just make sure you can hear me.
FARLEY-BOUVIER - Yes. Are we good? Coming in loud and clear, and your video is also great.
BRUMBACK - I'm Nancy Brumback. I'm a legislative advocate for the League of Women Voters of Massachusetts, and I'm representing our 44 local leagues in 55 communities from Cape Cod to the Berkshires. The league urges you to act swiftly on the Location Shield Act, bills HB 86 and SB 197, to ban the sale of cell phone location data in Massachusetts. The league strongly agrees with the advocates who are making arguments today on behalf of domestic violence survivors and people seeking reproductive health care and immigrants, but we think there are other groups that need to be sure their location data is kept private as well. The current political climate has encouraged organized groups and so-called lone wolves to threaten and harass individuals they disagree with. Access to their targets' location data provides them with additional ways to do that. Judges, doctors, journalists, and even election workers have been threatened and harassed just for doing their jobs. Public figures and politicians become targets based on the positions they take. You may have experienced that.
My own local league received a request last month from a candidate in our town election to remove their home address from our league's voter guide. We removed all the candidates' home addresses. And there's a bill that was filed this session to end the practice of printing the home addresses of candidates on ballots. This is the world we are living in. People who exercise their First Amendment right to free speech by participating in rallies and marches for various causes are now targets. Extremist groups make and distribute lists of people who demonstrate for causes that those groups disagree with. The private cell phone location data of people in all these targeted groups must be protected. Please report the Location Shield Act favorably and quickly out of committee and bring it to the floor for a vote. It almost passed last session. Let's not delay any longer. In the meantime, you might want to go to the settings on your own cell phone and switch the location button to off. Thank you, and I'll take any questions.
FARLEY-BOUVIER - Thank you, Nancy. Appreciate your testimony. We're now gonna go to a panel that is online, led by Carrie Richgels. Oh, you're here. You're here in person. Yay. Bring your panel up, Carrie. And as you're settling down, I'll just let you know that Elizabeth Mahoney is on deck.
CARRIE RICHGELS - PPAF - SB 29 - HB 104 - SB 45 - Good afternoon. And thank you, Chair Moore, Chair Farley-Bouvier, and members of the committee. My name is Carrie Richgels, and I'm the policy manager at the Planned Parenthood Advocacy Fund of Massachusetts. And I'm joined today by Doctor Kendra Harris and Megan Donnelly. We're here in support of several data privacy bills: an act to protect location privacy, H 86; an act establishing the Massachusetts Data Privacy Act, H 104, S 45, and S 29; and an act establishing the Massachusetts Consumer Data Privacy Act, H 78. We appreciate the urgency that the committee is demonstrating by hearing these bills so early in the session. The threats that they address are more and more pressing each day. Planned Parenthood League of Massachusetts is the Commonwealth's largest freestanding sexual and reproductive health care provider.
Across 4 locations and via telehealth, we offer birth control, STI testing and treatment, life-saving cancer screenings, abortion care, and gender-affirming care. We strongly believe that everyone should have access to this safe, compassionate care free from stigma and judgment. Each year, PPLM provides abortion care to more than 9,000 patients, with more than 50 percent choosing medication abortion. Location data that shows access to these services puts our community at risk. PPLM has been a target for a long time, but the current attacks make it even more important to create and strengthen protections in Massachusetts for our providers, our staff, our community supporters, and most of all, our patients. Fear of violated privacy should not keep anyone from seeking legally protected health care. While our organization is particularly concerned about location data, we also support broad, robust data privacy legislation. So, please consider us a resource and a partner in your efforts to move policy that prohibits the sale of location data, and grant these bills a favorable report.
SPEAKER1 - Thank you. So much.
DR. KENDRA HARRIS - PPLM - Good afternoon, and thank you for the opportunity to testify today. My name is Doctor Kendra Harris. I use she/her pronouns, and I live in Jamaica Plain, Massachusetts. I'm a board-certified OB-GYN and complex family planning fellow at Brigham and Women's Hospital. In my current fellowship, I provide abortion care and contraceptive care at Planned Parenthood League of Massachusetts. I'm also a fellow with Physicians for Reproductive Health. I'm here today to speak on my own behalf and to voice my strong support for data privacy protections. These laws will improve patient access to health care, and I believe they are essential if Massachusetts wants to truly be a safe haven for reproductive health care.
My patients are very concerned about their health data, and reproductive health data in particular, being used for criminalization. I'll bet most of us in this room can't remember the last time we went literally anywhere without our cell phones, least of all a medical appointment where we might need to call someone for a ride home, look up old medical records, or update a loved one on how the appointment went. But many of my patients have opted to leave their phones at home out of fear of their location being tracked and shared. Just a few weeks ago, one of my patients had an old flip phone. She told me she'd gotten it just to coordinate her medical care and communicate with her friend who had dropped her off and would pick her up. She took all of these extra steps so that there would be no paper trail or location data to trace her back to our health care clinic.
That is how scared people are of being tracked, harassed, or investigated, just for getting health care that is safe, essential, and normal. And this fear is not hysteria. It is warranted. Trust and security are fundamental to me being able to provide health care to my patients. It is unfair to add this layer of anxiety for folks who are just trying to navigate their own survival and care for their families. It is unfair for providers to be made unwittingly into sites of fear and criminalization. The hostility toward sexual and reproductive health care across the country makes the need for stronger data privacy laws more urgent for providers like myself, too. I am incredibly proud of the work I do, and my commitment to my patients and their right to decide what's best for their bodies is unshakable. Because of my physical appearance and the work I do, I take steps to protect myself and my family: I do not post my family on social media, I use a P.O. Box for mail, and I use DeleteMe, which is an Internet data privacy service. Despite all of these steps, my location data being available still leaves me vulnerable. I urge you to support this bill without hesitation and give it a favorable report.
SPEAKER1 - Thank you, doctor. Great.
MEGAN DONNELLY - PLANNED PARENTHOOD PATIENT - I really appreciate your attention for the next 3 minutes. I know it's been a long day, so thank you for listening to our panel and our testimonies. My name is Megan Donnelly. I am a resident of Cambridge, Massachusetts, and today I'm here as a Planned Parenthood patient and an advocate. Travel is a personality trait for me. I have visited over 42 states and 15 countries, and I'm always very proud to announce this state as my home. I grew up on Cape Cod and was raised there until my parents moved to Florida my senior year of college to support my dad in fighting his battle with stage 4 non-Hodgkin's lymphoma.
I became a resident of Florida when I moved in to care for my dad in the final years of his life and traveled quite frequently between here and there. In February, I found out I was pregnant. I felt very lucky to be in Massachusetts at that time. The decision to have an abortion was an easy one for me, and I knew that Planned Parenthood would provide me with the care and kindness and anonymity I desired. Had I been in Florida, where my job and health insurance were based, I would have worried about what it would mean for me to go back to a state that might be on the brink of criminalizing abortion. Had I been in Florida, I firmly believe that even then, I would have come back home for this care. The procedure was my choice. It's one of a series of millions of choices we make in our lives, from trivial choices, like what outfit to wear or what to eat for dinner, to big ones, like if I could be a mom or if I wanted to be a mom. All of these choices, big and small, are mine. They are mine to hold. They are mine to choose, and they are mine to share.
The day that I got the blood test confirmation that my medication abortion had been successful was June 24, 2022, the day the Supreme Court overturned Roe v. Wade. The relief I had at first felt was overwhelmed by knowing that millions of women, especially my friends and family in Florida, were now terrified about what this would mean for them. I had to watch everywhere, on social media, on the news, on the street, what everyone thought about this choice that I had made, even though they didn't know I'd made it. I chose then not to share. I will tell you why I am choosing to share today. A report last year revealed that location data that included people's visits to nearly 600 Planned Parenthood locations, including in Massachusetts, was sold and used to target people who visited Planned Parenthood in a nationwide anti-abortion campaign. There are people all over the country who think they have the right to police our bodies, my body, and the data collected on our phones that tracks our movements helps them do that. Let's make it a lot harder for them to do that. Let's make Massachusetts a state where no one can access your location data without your consent, where the only way you will know my choices is because I sit in this room and I choose to share them. Let's continue to be proud of this place that we call home, and I urge the committee to grant this bill a favorable report. Thank you.
FARLEY-BOUVIER - Thank you. Thank you all for your testimonies. I do have a question, Doctor. So you mentioned that you know a patient, or maybe more than one, who leaves their cell phone at home, right, to protect them from having their location tracked. So for that same patient, we can substitute in for the clinic a gay bar, a synagogue, any mental health facility, anything like that. But you brought up this example, so I wanna use it. How did they figure out where the clinic was?
HARRIS - So in this case, you know what I'm saying? Right? This was a patient that came from out of state, and they were with a friend that was familiar with the location of the clinic. But, I mean, I use my Google Maps every day to get to work even though I go there all the time.
FARLEY-BOUVIER - So, but could they have Googled the address?
HARRIS - Not with the phone that they had.
FARLEY-BOUVIER - Okay. But they could have, before they arrived. Did they Google where to go? Yeah. Did they say
HARRIS - Yes. They did.
FARLEY-BOUVIER - Is it safe to have an abortion? Is it too late to have an abortion? They could have Googled all that information.
HARRIS - They did a lot of searching in order to get that essential health care, yeah, to get to Massachusetts.
FARLEY-BOUVIER - So I just wanna clarify that the Location Shield Act helps only with the tracking of your exact location at the time when you go. But all the other ways that you get the information, around where to go, trying to decide what you're gonna do, all that information is not covered. So I just wanna clarify that for everybody.
RICHGELS - I absolutely agree. And in the other example that we used, a patient traveling from out of state, from Florida, all of the steps that they had to take online to figure out, you know, to buy a plane ticket, to search where to go, all of those other pieces of information, you're absolutely right, would point to their medical decisions, which are private.
FARLEY-BOUVIER - Yeah. Thank you. I really do appreciate all of your testimony. Megan, thank you for sharing your story. I know that it's not easy to do, but you've made the decision to do that in order to make a change. It's very empowering. Thank you very much. Thank you. Okay. So, Senator.
MOORE - It's actually not a question, Megan. I just wanna thank you. It's very hard for people to go up and testify in front of strangers, and also virtually, hybrid. But I just wanna thank you for coming, sharing your story, and letting us know what you went through and what we should be looking out for in other people. So thank you.
FARLEY-BOUVIER - Thank you. We now have Elizabeth Mahoney from the Mass High Technology Council, and on deck is Bruce Schneider. I'm sure I got that one wrong. Sorry. Nice to see you, Elizabeth. Thank you.
ELIZABETH MAHONEY - MASSACHUSETTS HIGH TECHNOLOGY COUNCIL - Thank you to the chairs and the committee for the opportunity to testify. My name is Elizabeth Mahoney, and I'm the vice president of policy and government affairs for the Massachusetts High Technology Council, the Commonwealth's oldest cross-sector association of CEO-level leaders of technology, professional services, and research institutions. The council and its members are very engaged in the issues addressed by a number of the bills before you today. The council's MassVision 2050 initiative is a multiyear effort to bring together private, public, and academic leaders with an eye toward supporting the key sectors that are likely to drive employment and economic growth in Massachusetts in the coming decades and ensuring Massachusetts is a leader in these sectors. MassVision 2050 focuses on 9 sectors, including artificial intelligence and cybersecurity. The council is grateful to the legislature for supporting a hundred million dollars in funding for applied AI in last year's economic development bill, and we and our members have been actively engaged with the Healey administration as they establish a Mass AI Hub.
The council has also established a cybersecurity community with our members, an opportunity for member CIOs and cyber leads to share best practices and learn from other content experts. We were pleased to have the chairs join us to share their expertise with this group last year. The council shares the legislature's goal of providing robust consumer protections and appropriate guardrails around new and emerging technologies. We will provide more detailed thoughts on the bills for you in our written testimony, but today, I'd like to offer some overarching thoughts on how to achieve these shared goals while also supporting the growth and innovation of these industries. One of the challenges of regulating areas like cybersecurity and artificial intelligence is that the technology, the landscape, and the potential dangers change so rapidly. While it makes sense to put core requirements and frameworks in statute, the council appreciates that several of the bills before you establish entities within state government charged with developing detailed rules and regulations around AI and cybersecurity standards. This will allow the state government to be nimble and responsive to changing needs and circumstances.
Including representatives from impacted industries on these boards also provides an opportunity for subject matter experts to ensure any impacts on these industries are carefully considered. The council also appreciates the recognition that smaller businesses may have different abilities and challenges when it comes to implementing cybersecurity measures. Another challenge for state regulation of AI, cyber, and data security is situating this regulation within the existing federal and other state frameworks with which companies must comply. To make it easier for businesses to comply, we do support efforts to align with existing standards and best practices to the extent possible and to avoid duplicative reporting requirements for certain cyber incidents when appropriate. While there are many challenges to consider with these new technologies, there's also an opportunity for Massachusetts to be a global leader in the innovation and development of these sectors while also protecting consumers. We will be submitting written testimony with more detailed input, and the council welcomes the opportunity to continue to engage with this committee on these topics and share the insights and recommendations of our members.
SPEAKER5 - Thank you, Elizabeth. 2 seconds.
FARLEY-BOUVIER - You did really well; you must have practiced or something. So we shifted a little bit, talking about the cybersecurity bills, but we see how this all comes together. But I just wanna check. Senators, are you good? Anybody else on the committee? Thank you.
SPEAKER7 - Thank you. Thank you so much.
FARLEY-BOUVIER - Looking forward to continuing to work with you. Absolutely. We now have Bruce Schneier. And, up on deck, virtually, is Hayley Tsukayama. Is Bruce here? No. Okay. So, Hayley, I didn't give you a very long heads up that you're coming up. Are you there, Hayley? And then we have Jonathan Cohn after Hayley.
SPEAKER43 - I am here. Hello.
FARLEY-BOUVIER - How are you? Terrific. Thank you so much.
HAYLEY TSUKAYAMA - EFF - HB 86 - SB 197 - ( R ) - Thanks for allowing me to speak today. So, good afternoon. My name is Hayley Tsukayama. I'm speaking today on behalf of the Electronic Frontier Foundation, or EFF, in strong support of H 86 and S 197, the Location Shield Act. EFF is a nonprofit, nonpartisan, 35-year-old digital rights organization that fights for civil liberties in the digital age. We have more than 30,000 supporters across the country, including in Massachusetts, and we advocate for policies that protect people's privacy and encourage innovation. Thank you all for your commitment to enacting robust, meaningful data privacy legislation to protect all people in Massachusetts from inappropriate surveillance and exploitation of their personal information. We deeply appreciate the leadership the chairs have demonstrated by recognizing the need for strong consumer data privacy laws. The Location Shield Act would ban the sale of all cell phone location data in the Commonwealth with reasonable exceptions. This is a crucial privacy protection. There are many reasons why location privacy is particularly important.
Where we go says a lot about who we are. Location information can reveal where we live, where we work, where our children go to school, what organizations, support services, and businesses we may frequent, and where we worship. It's very easy to re-identify people based on their location information, particularly if the information set captures habitual trips, such as commutes. Corporations eye this information for profit, disclosing huge quantities of often sensitive, granular data about all of us in ways a normal person would never expect, and they sell it to whoever can pay them for it. Companies have also shown that despite the clear risks that the sale of this information can present to individual people, the profit motive outweighs any willingness to change these practices.
That's why it's important for the legislature to weigh in. This is especially true for vulnerable populations, such as immigrants to this country, who may be rightfully wary of location tracking happening as they go about their daily lives. The same may be true of domestic violence victims who wish to conceal their location or people who are merely attending protests or rallies. The truth is that everyone deserves location privacy. We all have moments in our lives where our movements make us vulnerable. No one wants to have all of their movements tracked, and no one should have to worry that information about their daily lives will be weaponized against them. The Location Shield Act is a common-sense measure that protects this data from unexpected sale while also letting consumers elect to use technologies that require location data. Thank you for your attention and your consideration.
FARLEY-BOUVIER - Thank you, Hayley. I'm gonna turn to the committee, see if there are any questions. Appreciate your perspective. We're now gonna move to Jonathan Cohn from Progressive Massachusetts, and on deck will be Don Bell.
JONATHAN COHN - PROGRESSIVE MASSACHUSETTS - S197 - HB 86 - Chair. Good? Okay. Awesome. Thank you so much. Chair Farley-Bouvier, Chair Moore, and members of the joint committee on advanced information technology, Internet, and cybersecurity. My name is Jonathan Cohn. I'm the policy director of Progressive Massachusetts, a statewide grassroots advocacy group fighting for a more equitable, just, sustainable, and democratic commonwealth. As many others have done before, we urge you to give a favorable report to S 197, An Act to protect safety and privacy by stopping the sale of location data, and H 86, An Act to protect location privacy. There were many other excellent bills that people testified on today, and so I'll give a shout-out to those as well among the many steps to support privacy rights. To give a recent example, this Saturday, tens of thousands of Massachusetts residents rallied to protest, kind of, the chaos, cruelty, and corruption of the Trump administration. That may include some of you. If you attended a rally, you know when you arrived and when you left and where you went next. Your friends and family might know that too if you texted them. But do you know who doesn't need to know that information? Somebody like Elon Musk, bad actors seeking to surveil protesters exercising their first amendment rights; the list goes on.
As folks before me have noted, right now, there are no laws preventing anyone with a credit card from purchasing cell phone location data. The purchase and sale of that data can empower bad actors: right-wing extremists seeking to target individuals seeking abortion care or gender affirming care, domestic abusers seeking to track their victims, predatory bosses seeking to spy on their employees, and the list goes on. And by attacking privacy rights, it also weakens the basic rights of free expression and dissent in democracy. We have already seen, as folks before me have noted, that the Trump administration detained and threatened to deport students merely for the act of attending protests, and they are not subtle about their desire to ramp up targeting and to target citizens as well. And we should not be giving them any more tools to do so. As your chamber deliberates on our Commonwealth's response to what's happening in DC, we urge you to make this bill a part of it. 250 years ago this month, Massachusetts was the site of taking a stand against the abuses of civil liberties, and so that's a legacy that we should continue. So thank you again for being here this afternoon.
FARLEY-BOUVIER - Thank you, Jonathan. Any questions from the committee? We appreciate it.
COHN - Thank you so much.
FARLEY-BOUVIER - Okay. So we understand Don wasn't able to stay. And so we're gonna go to Amy Van Der Hiel, who's online. And then up next is a panel led by Mel De Silva. Amy, are you with us?
SPEAKER15 - I am. Thank you very much.
FARLEY-BOUVIER - Excellent. I hear you well.
AMY VAN DER HIEL - CONCERNED CITIZEN - HB 86 - ( R ) - Thank you very much. Thank you to the members of this committee. My name is Amy Van Der Hiel, and I live in Maynard, Massachusetts. I'm here in support of Bill H 86. I'm testifying in support of this bill to prohibit companies from selling our location data. It's an urgent and powerful opportunity for our legislators to step forward to protect the privacy of the people of Massachusetts. Although I'm speaking as a private citizen, not for my employer, I've worked in the Web technology field for more than 20 years. We've all seen incredible innovations like the web and smartphones in the past decades and how they've changed our lives, enabled us to access much of the combined knowledge of the world, and be connected in ways never before imagined. It's even allowed us to participate in democracy in new ways, like me being able to attend this hearing and testify to you now, for which I'm grateful. However, just as these technologies allow us to connect in wonderful ways, we've sadly learned over and over that they can be misused to take advantage of us or even endanger us. As has been noted before, privacy is a human right. In 2019, the U.N.
Human Rights Council declared that the same rights that people have offline must also be protected online, including the right to privacy. However, these technologies we use have become so complex that it can feel almost impossible for us to protect our data, even as individuals, since location and identifying data can be gathered from so many different sources, including GPS, Wi-Fi, Bluetooth, innocent-seeming apps, and even motion sensors on our phones. As so many people have said, data about where we go and whom we see should not be for sale to the highest bidder. Without such a law that protects our data, Massachusetts residents and visitors face growing new threats to our fundamental freedoms and our personal safety, as so many here have so well described. This bill strongly protects us, and it's more than fair in that it simply requires companies who want to use our data to get our consent first, and it still allows for positive uses, such as during emergencies. I ask all of you to please pass this important bill to protect the people of Massachusetts. I want to thank the committee and its chairs, Senator Michael Moore and Representative Tricia Farley-Bouvier, as well as the sponsors and cosponsors of the bill, which I'm proud to say include my own Senator James B. Eldridge, for all your work to stand strong and to fight to protect the liberty, freedom, and privacy of the people of our state. Thank you for your public service and your dedication to the people of Massachusetts.
FARLEY-BOUVIER - Thank you so much, Amy. To the committee? Anybody? Thank you. We're now gonna move to a panel led by Mel De Silva. And on deck is another panel led by Cindy Rowe. Do we need another chair pulled in? I have another chair. Okay. You're welcome. I'll get you both. Do you care who starts?
SPEAKER36 - Do you care who starts?
FARLEY-BOUVIER - No. I do not care who starts.
SPEAKER16 - Alright.
FARLEY-BOUVIER - You guys decide amongst yourselves. Cool. Yeah. We sort of did. Okay.
ARLENE ISAACSON - MASSACHUSETTS GLBTQ POLITICAL CAUCUS - HB 78 - SB 45 - Good afternoon. For the record, my name is Arlene Isaacson. I'm testifying today in my capacity as the co-chair of the Massachusetts GLBTQ Political Caucus, formerly the Mass Gay and Lesbian Political Caucus. I wanna start by thanking the chairs for filing strong, terrifically important data privacy legislation, House 78 and Senate 45. My testimony today will be in support of those bills and several other data privacy bills, but I'll be focusing on location privacy in particular because of the shortness of time, of course. So House 86 and Senate 197. But I don't want that to overshadow our support for the chairs' bills, as well as Senate 29 and House 104. I'm speaking specifically from the vantage point of the LGBTQ community. Many people in our community, many folks in the LGBTQ community, in this world, don't feel particularly safe. Lots of us are not yet out of the closet. I shouldn't say us in that case, obviously. But there are plenty of my people who are afraid to come out because they're afraid of losing their jobs or their family connections.
Too few people, of course, realize that many of our apps, as you've heard, sell our location information. And they sell it to anyone, from our nosy neighbor or our inappropriate neighbor to our employer or, worse, in our case, to extremists who believe that our very existence is a scourge upon the universe. For LGBTQ folks, that can be terrifying. It exposes us to harassment. And, candidly, it's downright creepy. We already know that right-wing groups have used location information to out gay men and cause them to lose their jobs. As for the trans community, my colleagues will cover that, so I won't do it here, lest I be redundant. But I will speak on a personal level. As some of you who know me know, I've been doing LGBTQ advocacy for 3 decades. Many times during those years, I received threats from anti-LGBTQ extremists that they would kill me or do this to me or that to me or hurt my family.
The threats were commonplace. And the threats were bad enough that after 9/11, during the anthrax scare, my then partner asked me to stop opening our organizational mail, our snail mail, at our home in case there might be white powder included in it. Which, as you may recall, was a fear back then. Now, I was able to protect myself somewhat back then. I never listed my address or my phone number publicly. I lived in a secure building with strong locks on the doors and the windows. I can't imagine what I'd have done if some anti-LGBTQ hater, perhaps from Texas, had been able to track my movements, know where I live, where I shop for groceries, where I worshiped, or where I pick my kids up from school every day. That would have been not just creepy but downright dangerous. So I ask you to help protect us by limiting access to location information, by banning its sale, and by providing privacy protections for other kinds of data as well. Thank you. Thank you.
MEL DE SILVA - TRANS HEALTH - Thank you for the opportunity to testify this afternoon. My name is Mel De Silva. I use they/them pronouns. I am the director of development and communications at TransHealth, and I am the proud parent of a trans kid. Located in Northampton, TransHealth is the only independent nonprofit healthcare organization in the United States solely devoted to serving the trans and gender diverse community. TransHealth supports comprehensive data privacy legislation, and today, I will speak specifically about location data privacy as it relates to gender affirming care providers. This legislature has worked tirelessly to make Massachusetts 1 of the safest states for the trans community. However, given the unrelenting federal attacks on trans people, there is an urgent need to take additional legislative action to protect the residents of the Commonwealth. Delivering gender affirming care is increasingly dangerous work. Nationwide, 70% of gender affirming care providers surveyed reported receiving threats, and that was in 2023.
So if we redid the survey this year, I think it would actually be a lot worse. Adding to that risk, it is currently legal in Massachusetts for bad actors to purchase location data to track the movements of anyone who works in the field of gender affirming care. This could be health care providers, nurses, medical assistants, medical receptionists, and even researchers. With this data, anyone with an intent to cause harm and commit violence could pinpoint the locations of any TransHealth employees, including where we live, putting us and our families at risk. Without a ban on location data purchasing, the expansion of the state address confidentiality program to include those working in gender affirming care has a limited effect and offers little security. We need this additional protection. It is critical that the legislature act quickly to pass comprehensive data privacy legislation that includes banning the purchase of location data, which can be used to harass, stalk, and commit violence against those who care for and support the trans community. I would like to thank this committee again for the opportunity to testify today on this important issue. And if you have any questions, I can answer them. Thank you.
SPEAKER1 - Thank you so much. Continue with the panel.
KELSEY GRUNSTRA - MASSACHUSETTS TRANSGENDER POLITICAL COALITION - HB 86 - SB 197 - Hello there. I am Kelsey Grunstra, and I'm the deputy director of the Massachusetts Transgender Political Coalition, the oldest transgender advocacy and community development organization in the United States. Thank you so much to the committee for hearing my testimony today in support of the Location Shield Act, H 86 and S 197. Although there are many reasons to focus on location privacy, it is especially urgent for vulnerable, multiply marginalized communities, such as transgender, nonbinary, and gender diverse people. Trans people already navigate the world facing more risk than our cisgender peers.
Many of us must contend with family members, current and former partners, and others in our lives who refuse to respect our identities and may want to perpetrate harm against us. The open availability of location data to brokers and other third parties makes it easier for stalkers, abusers, and other would-be perpetrators to track, harass, and persecute us. Moreover, location tracking can also reveal when a trans person has access to gender affirming services, support, or health care, which can be increasingly risky if they are trying to avoid prosecution or harassment because of it. This is particularly salient as we are being actively persecuted by a federal government that is attempting to outright eradicate the right to safety and privacy of all trans and nonbinary Americans.
In particular, gender affirming health care for trans and non-binary youth is under attack. There are incredibly high risks for these children's caregivers and the providers of this life-saving pediatric care. Many people come to Massachusetts to seek gender affirming care that they can no longer access in their home states. Increasingly, they may face prosecution in other states whose activist legislatures have passed overreaching laws. Let me be clear that no one in Massachusetts should ever be concerned about a bounty hunter tracking them down because their location data is up for grabs online, especially not a child, a loving parent, or a dedicated health care provider.
Since Donald Trump's inauguration in January, MTPC has already had to shift a public name-change clinic event at the Somerville Public Library to a virtual event out of concern for the safety and privacy of our clients, staff, and community partners due to hateful and threatening comments. Fear should never be a factor in why trans and nonbinary people are unable to access supportive and affirming resources or care. We must be free to leave our homes, doctors' offices, workplaces, and community spaces without the fear of being tracked or bringing any danger upon ourselves, our providers, or our communities. We at MTPC are incredibly grateful for this committee's commitment to upholding the privacy rights of people across Massachusetts, including omnibus privacy legislation and regulation that would prohibit the sale of location and all other sensitive private data. We strongly urge the committee to report the Location Shield Act favorably. The people of Massachusetts deserve to use their devices without worrying about their information being sold to anyone who wants it. Thank you for your time.
SPEAKER1 - Thank you.
RAYNA HILL - MASSACHUSETTS COMMISSION ON LGBTQ YOUTH - Good afternoon, and thank you. My name is Rayna Hill. I use she/her pronouns, and I am the legislative and policy manager with the Massachusetts Commission on LGBTQ Youth. And I wanna thank you all for the opportunity to share the commission's support for data privacy protections broadly. And specifically, in this instance, I'm gonna be talking about the Location Shield Act and location safety. And I also wanna thank you just for being here today. I know that my nightmares have gotten more specific after this hearing, and I'm sure for you as well; this is quite terrifying.
FARLEY-BOUVIER - It's nothing quite like serving on this committee. We're a lot more scared.
HILL - Yeah. I'm gonna share 2 points with you today, and we'll follow up with some written testimony early next week. The first is that the Location Shield Act, and data privacy protection specifically, is going to help protect LGBTQ youth and families in Massachusetts. As has already been shared and as we are all aware, the unrelenting attacks on the safety of transgender and nonbinary kids and anyone supporting them are clear. Over the last year, the commission has seen a significant increase in outreach from terrified parents who are asking how the state can help protect them and their children, especially from families who have already begun to flee from their home states to here in Massachusetts. We have heard from parents and youth that they are terrified to go to public events, support spaces, and find other services, particularly after we've seen the bomb threats at Boston Children's Hospital over their gender affirming care program, the violence and harassment at LGBTQ-centered library events, school committee meetings over curriculum and books, and graffiti and vandalism to LGBTQ spaces over the last several years.
By passing these bills to provide a data privacy safety net over Massachusetts, parents of trans and nonbinary kids can have a little bit more breathing room and not have to worry about anti-LGBTQ hate groups, of which we have several centered and founded here in Massachusetts, using their location data to stalk and harass their families at home, at school, and at the library for just being supportive and loving parents to their children. The second point we wanna share with you is that this bill will help protect teachers and librarians. The commission has seen an alarming increase in the number of educators reporting that they are scared to go to work. They're scared for their safety because of the LGBTQ-inclusive curriculum that they teach, the affirming events that they hold, and the books that they display in their communities.
We have librarians describing instances of aggressive community members entering libraries and threatening librarians to remove book displays featuring books with black trans characters, and teachers getting verbal and written threats to remove supportive signage and flags from their classrooms. No person should be afraid to go to work simply because they are trying to make their communities safer and more affirming for children. We need to increase the protections for our teachers and librarians, and these bills will help do that, specifically around preventing the sale of location data. The easy access to location data for Massachusetts residents poses an extreme danger to constituents, to schools, communities, parents, and kids. And it's not a matter of if people are going to use this data; it's a matter of when. And I think that's really clear. And I would say, for the commission as well, just for our staff, we've gotten a lot of threats and doxing. And I'll say that state employee staff are really scared right now, and this will help. So thank you.
FARLEY-BOUVIER - Thank you so much. There are a lot of different perspectives on this panel, and we appreciate it. I just wanna see if there's anybody on the committee. Yes. Sorry. Representative Owens. It's okay.
OWENS - Most of the panel knows me and Steve anyway. Thank you so much for coming out today. I appreciate this. I wasn't gonna ask this, but I know most of you have been here all day, and I wanted to give you a chance, because we had a comment earlier about how the downside of some of these rules would be that vulnerable populations would maybe not get the correct ads served to them where they need them. So I just, is that a concern that you feel members of the LGBT community might have?
GRUNSTRA - I just have one really quick comment, which is that, in Rep Meschino's dutiful research of the prior testifiers' clients, I can say that MTPC is already barred from advertising on services and platforms such as Meta because of our allegiance to supporting transgender and nonbinary individuals. We have been banned multiple times from advertising on that platform, and we are also banned from specifically targeting people based on their gender identity and their race. So that argument falls flat. Thank you.
SPEAKER8 - Very interesting. Senator.
ISAACSON - May I just at some point respond to that? I'd just like to say that the LGBT community is deeply touched by the concern from the advertising industry. And we appreciate that they are concerned about our ability to have ads provided to us. Nonetheless, we are willing to make that terrible sacrifice in order to have privacy and protection in our lives.
MOORE - Just a quick question to any one of you: how long has that restriction been in place?
GRUNSTRA - I can tell you the first time our account was banned was in 2023, and it happened twice in 2024. And we have made the executive decision at our organization that we will no longer even attempt to advertise on Meta's platforms.
FARLEY-BOUVIER - That's really helpful, and I appreciate your testimony. We look forward to continuing to work on this as these bills move forward in the process. Okay. So we have our next panel, which is led by Cindy Rowe. Cindy, if you wanna bring your crew up, and then there's a panel following led by Brianna Savage. I just wanna say something: the time is getting late. This committee is here for as long as you need us. The only thing I'm gonna ask for is your consideration of the other people who have been waiting. And so if we can, you know, maybe even on panels, if we each cut it down a bit. We're just trying to tighten it up a little bit so everybody waiting has a chance. And written testimony is awesome. So anything you wanna provide in writing is super helpful to us. Cindy, go ahead.
CINDY ROWE - JALSA - Thank you. Thank you to both chairs and to the members of the committee. My name is Cindy Rowe. I am the CEO of the Jewish Alliance for Law and Social Action, a membership-based nonprofit organization based in Boston with thousands of members and supporters statewide. Guided by our Jewish teachings and values, we are devoted to the defense of civil rights, the preservation of constitutional liberties, and the passionate pursuit of social, economic, racial, and environmental justice for all people. JALSA wishes to offer testimony in strong support of the Location Shield Act. Additionally, JALSA would also support more expansive omnibus legislation that would protect privacy and include the Location Shield Act.
We know that unscrupulous and unregulated data brokers currently buy and sell personal location data from apps on our cell phones, revealing much about us without our knowledge. Moreover, these data brokers are not the only entities who procure, use, and abuse personal location data. Anybody can procure this data, including those seeking to perpetrate hate crimes driven by antisemitism or any other form of hatred. We are living in a time of heightened hatred and bigotry. From 2022 to 2023, the number of recorded antisemitic incidents across New England skyrocketed by more than 200%, reaching a 40-year high for the region, according to a recent report. In Massachusetts, the number of antisemitic incidents soared from 152 in 2022 to 440 in 2023. Massachusetts recorded the fifth highest number of incidents per state in the entire country, behind only California, New York, New Jersey, and Florida. Furthermore, these incidents touch all corners of the Commonwealth: 127 different municipalities saw at least 1 antisemitic incident in 2023, an increase from 71 in 2022.
Imagine a white supremacist group buying the location data for those who enter a synagogue or, for that matter, a mosque, or a predominantly black church, or any other place of worship that serves a community that could be targeted by those who are looking to commit violent acts driven by hate. Failure to address this violation of our privacy when we have the means to protect people is irresponsible and could lead to tragedy. The voters who elected you put their trust in you to enact laws that protect people in our commonwealth. I can think of no better example for you to live up to that responsibility than to enact this bill. In doing so, you may be preventing the next tragedy from taking place here in our state. JALSA strongly urges the members of the joint committee to favorably report this bill out as soon as possible, and to support more expansive omnibus legislation that would protect privacy and include the Location Shield Act.
LYV NORRIS - REPRODUCTIVE EQUITY NOW - Good afternoon, chairs and members of the committee. My name is Liv Norris, and I am here on behalf of Reproductive Equity Now in strong support of the Location Shield Act. Location privacy is not just a data privacy issue. It is fundamental to personal autonomy and safety and is directly connected to reproductive justice. Corporations collect and sell vast amounts of highly detailed sensitive data about all of us. And unless and until Massachusetts takes action, individuals in our communities are vulnerable to real harm at a time in our country when protecting information about your private reproductive health choices is more urgent than ever.
Efforts to criminalize abortion care are ever-evolving, with patients and providers facing both civil and criminal threats from hostile states. Last year, it was discovered that an anti-abortion group had targeted patients from nearly 600 Planned Parenthood locations across the country with anti-abortion political advertising. It's incredibly disturbing that unregulated data brokers are profiting from selling the location data of abortion patients and that anti-abortion extremists are among their buyers. And there are bigger threats on the horizon when it comes to reproductive freedom and data privacy. With a federal administration that poses an existential threat to abortion access, anti-abortion politicians are emboldened to weaponize location data as a way to repress patients who are seeking abortion care across state lines.
As patients travel here for care and our shield law protects providers offering medication abortion across state lines, the significant danger of selling location data, which identifies where we live, where we work, and where we get health care, is clear. The Location Shield Act provides straightforward protections to give ordinary people robust location data privacy and is an essential step to ensuring that Massachusetts residents and visitors are able to use everyday technologies without compromising their privacy. We urge the committee to give a favorable report to the Location Shield Act. Thank you so much.
SPEAKER1 - Thank you so much.
NITHYA BADRINATH - JANE DOE INC. - HB 86 - SB 197 - Good afternoon, Chairs Moore and Farley-Bouvier and honorable members of the joint committee on advanced IT, Internet, and cybersecurity. My name is Nithya Badrinath, and I'm the policy director at Jane Doe Inc., the Massachusetts Coalition Against Sexual Assault and Domestic Violence. I'm here today to testify in support of the Location Shield Act, H 86 and S 197. As the state coalition with 60 member programs that work directly with victims and survivors of sexual assault and domestic violence, I'd like to take a couple of minutes to discuss our support for these important bills. The Location Shield Act would stop the sale of cell phone location data while still allowing companies to collect and process this data with consent.
Data brokers have access to and sell vast quantities of cell phone data. Data brokers will sell this information to anyone with the money to buy it, including people causing harm, such as those committing domestic violence or stalking. The location data broker business threatens every person in our state but is especially dangerous for people in abusive relationships and those targeted by stalkers. An individual's personal location information reveals the most sensitive and intimate things about them. Survivors often take careful steps to conceal their locations, but all their efforts can be undermined by the sale of location data that shows where they seek shelter and reveals their daily routines.
Last year, Massachusetts passed incredibly important legislation allowing survivors to seek protection orders for experiencing coercive control. Abusers having access to a survivor's personal data through data brokers enables coercive controlling behaviors. A common one is tracking a survivor's location to stalk, intimidate, harass, and monitor a survivor. The threat of this dangerous industry that facilitates stalking and abuse is alarming, but we know that the legislature can act again this year to further support a survivor's access to privacy and safety. Survivors of domestic violence and stalking, people seeking sensitive health care, and everybody in Massachusetts should have control of their own personal data. We ask the committee for a favorable report of the Location Shield Act and also broad, robust privacy legislation such as the Massachusetts Data Privacy Act and Massachusetts Consumer Data Privacy Act. Thank you for your consideration.
FARLEY-BOUVIER - Terrific. Thank you. Any questions for the panel? We really do appreciate your testimony. Thank you very much. We have our next panel, led by Brianna Savage, and then we are going to move to Sam Larson of the Associated Industries of Massachusetts. And then we're gonna go back online to CG. In any order you would like.
BRIANNA SAVAGE - YW BOSTON - HB 86 - SB 197 - Good afternoon. Thank you, chairs and members of the committee. My name is Brianna Savage. I am the associate director of advocacy at YW Boston. I'm also a lifelong Massachusetts resident, a homeowner in North Reading, and a new mom. I am here representing myself and YW Boston. YW Boston is a nonprofit organization that works to eliminate racism, empower women, and promote peace, justice, freedom, and dignity for all. We do that through our industry-leading leadership development, consulting and training, and youth development, and through advocacy work focused on reducing barriers to equity in Massachusetts, which is why I'm here today in strong support of H 86 and S 197, the Location Shield Act. YW Boston is committed to upholding the rights of all citizens regardless of their background, especially of those who have been historically targeted or marginalized. As technology continues to advance with the support of private entities, we must advance our laws to protect our civil rights. We cannot let their ruthless need for profit and power outpace our fundamental right to privacy and protection.
Location privacy is fundamental to personal autonomy, safety, and the exercise of our other basic rights. But unfortunately, no current law forbids companies from buying and trading our location information. Extremely sensitive information revealing who we are, what we do, with whom we associate, what we are interested in, and how we behave is up for grabs in a multibillion-dollar marketplace. We must challenge those who wanna monetize discrimination. This data has been used to track and identify people seeking reproductive health care services, attending religious services, visiting homeless and domestic violence shelters, and going to sites that provide addiction treatment services. Imagine the people who would most likely need to seek reproductive services, be targeted for attending religious services, visit shelters, or receive treatment.
And I'm sure you can see why this is a civil rights issue. We know that women, people of color, and folks with marginalized gender identities are especially vulnerable to the sale of this data. Today, authoritarianism is a growing threat to our most cherished democratic values and civil rights. Passing the Location Shield Act will ensure that our cell phones cannot be weaponized against us. And so, YW Boston respectfully asks this committee to favorably report the Location Shield Act, and we strongly support including a ban on the sale of location data generated from electronic devices in the Commonwealth in omnibus consumer privacy legislation, as you did last session. Thank you so much for your attention and consideration.
SPEAKER1 - Thank you. So nice to see you. Nice to see you.
FAYE RUTH FISHER - JCRC - HB 86 - SB 197 - Thank you, Chair Moore, Chair Farley-Bouvier, and members of the committee, for your time and attention to the important matters before you today. I will submit written testimony and try to self-edit. But for the record, my name is Faye Ruth Fisher, and I'm the chief of public affairs and community relations for the Jewish Community Relations Council of Greater Boston. And I'm testifying in strong support of H 86 and S 197, known as the Location Shield Act. Our mission is to promote an American society that is democratic, pluralistic, and just. The core of any pluralistic society is ensuring that residents can live self-determined lives with safety and meaning. S 86 I'm sorry. I'm flipping my bills. But this bill would help forward this goal by helping some of the most vulnerable residents have the freedom and autonomy to make decisions without fear of violence, including the freedom to access health care and worship how and where they choose.
These are basic fundamental principles that have helped govern our commonwealth and that we as a Jewish community have relied upon. And we are grateful for the steps the legislature has taken to safeguard these and to invest in communal security. As others on this panel have shared, our communities are continuing to experience this rise in extremism, with the targeting of worship spaces and community organizations. We are grateful that one such recent incident targeting the Jewish community was prevented, resulting in an arrest in Beverly and the confiscation of weapons, including ghost guns. The person had both the means and the violent intention, which is frightening and chilling. These statistics increase year over year; Cindy Rowe shared what those statistics were, and some of the ones that were shared come from the Massachusetts Executive Office of Public Safety and Security. The JCRC endorsed both of these bills in 2023 and continues to strongly support them as important and needed interventions to stop someone or people with means and intention from accessing personal location data to do real harm. We respectfully urge you to report these bills favorably out of committee as a practical next step to protect so many diverse communities. Thank you so much for your time and attention. Thank you.
NASIR ELEDRUSE - COLOR OF CHANGE - SB 197 - Thank you, chairs. Thank you, members of the committee, for this opportunity to speak. My name is Nasir Eledruse. I'm from an organization called Color of Change. We are a racial justice organization advocating on behalf of black people in the Commonwealth and across the country. I'm here testifying in support of the Location Shield Act, H 86 and S 197. This legislation is critical for protecting privacy and promoting racial justice. Members of the committee, what does mass data collection feel like? It might depend on who you are. For most people, your location data might be harvested from weather apps or your gas station finder, primarily to target you with ads based on where you go in both good and bad weather or after filling your tank. But the average phone has about 25 apps collecting and selling this data.
But what if you use apps specific to your community? Just last week, Muslims in my community in Worcester gathered for the Eid prayer. We often rely on an app called Muslim Pro for prayer times and a pocket Quran. In 2020, we learned that this app collected location data and that it was accessed by federal agencies, including the military, circumventing the Fourth Amendment, like giving a landlord cash to access an apartment without a warrant. Imagine if you use any other app that is unique to a community that you are part of. Jewish communities use an app like Sefaria to interface with the Torah or other religious texts. Black people using the BLK dating app, or LGBTQ people using another dating app, Grindr. Women using fertility apps, children on social media: nothing stops these apps from selling your location.
In 2019, the New York Times opinion team showed just how specific this data is. They tracked a worker on their day off interviewing for a new job and even identified a musician who played in Trump's first inauguration band. The Heritage Foundation, a conservative think tank, proudly boasted online about being able to use this data to ID people and organizations helping migrants across the Southern US border. For those already marginalized by police or dependent on government support, this data is and could further be used without consent to make decisions about their lives or even threaten their safety. This is wrong. Companies don't need to track our every move to provide services. Massachusetts has the opportunity to lead on data privacy by enacting this legislation and protecting personal freedoms for all communities. I ask that you please report this bill favorably and urgently out of committee; with an unstable federal government, lives are at stake. Thank you.
FARLEY-BOUVIER - Thank you. Thank you all so much. Just checking in with the committee. Terrific. We appreciate your viewpoints. Thank you. We're gonna go now to Sam Larson, and then we are gonna go to CJ Niquette, who will be joining us virtually. Nice to see you, Sam.
SPEAKER7 - Nice to see
SPEAKER35 - you all.
SPEAKER52 - I will be brief.
SPEAKER1 - Appreciate your patience today.
SPEAKER52 - No. No. I'm the lobbyist. No one should ever
SPEAKER7 - be working. Okay.
SPEAKER1 - Noted. Yeah. It's my job
SPEAKER52 - to be here.
SAM LARSON - AIM - HB 80 - SB 33 - Okay. Good afternoon, Chair Farley-Bouvier, Chair Moore, members of the committee, and staff. You're all here, too. My name is Sam Larson. I'm the vice president of government affairs at the Associated Industries of Massachusetts. We are the largest statewide business association. On behalf of our 3,400 members, particularly our small and medium-sized businesses, I am here to testify in favor of House Bill 80 and Senate Bill 33, an act establishing the comprehensive Massachusetts Consumer Data Privacy Act, filed by Leader Hogan and Senator Driscoll. This legislation, if adopted, would create a New England framework to regulate data privacy and, in doing so, give Massachusetts consumers some of the strongest comprehensive protections in the country and, at the same time, set strict limits on how businesses collect and use consumer data. Consumers would be able to control their data by deleting it or opting out of various services.
They would also enjoy enhanced protection on sensitive data and have to opt in for the use of that. Adopting a New England model will do 2 things. One, it's gonna establish consistent rules, definitions, and regulations for our members, which in turn is gonna reduce compliance costs. Conversely, every unique state-specific data privacy law increases compliance costs. For example, the California attorney general estimates that small businesses with under 20 employees spent on average $50,000 each just to comply with their state's data privacy law when it was rolled out. Regulatory uncertainty and chaos are the order of the day in Washington. I think during this hearing, we've changed our national trade policy about 3 times. And so we are asking that you create some predictability here for businesses. Given the complexity of the subject matter, we ask that you carefully consider businesses that do not have sophisticated compliance teams and cannot afford outside counsel, and that you limit the unintended consequences for those companies.
Enforcement is equally important as compliance. Under this bill, the attorney general is charged with enforcement, and there is no private right of action. For legal compliance schemes as complicated as data privacy, enforcement is best left in the hands of, you know, responsible policymakers. And private attorneys, if given the power, will weaponize data privacy laws against businesses for personal profit. The financial incentives are simply too strong. Our decision to endorse this bill and the recently engrossed pay transparency legislation represents a new direction for the organization. We've heard loud and clear that the business community needs to step up and offer solutions instead of just complaints. So we believe this bill does just that. I have thoughts on many other bills and will expand in written testimony in light of everybody's time. I'm open to questions.
FARLEY-BOUVIER - Sure. Just checking in with the committee. So, I have a couple of questions. Yeah. Okay. Let's do it. So we heard earlier about how the other comprehensive data privacy bills are good for small businesses, and there seems to be some conflict about which bills are better for small business. And so I'd like you to talk about small business. You know, Sam, the real small businesses.
SPEAKER7 - Yeah. Yeah.
LARSON - For sure. Many of them sit on our board and talk to me very frequently. I think there are maybe some benefits in the other bill, but speaking to ours, I know that there are clear carveouts for many small businesses. And for those that aren't carved out because they, you know, sell large volumes of things, there are clear rules in place. If you have a custom rule that doesn't follow a New England framework or neighboring states, it's gonna get extremely expensive to comply with it. And that's where we're coming from in terms of small businesses. Okay.
FARLEY-BOUVIER - Well, thank you for that. I have heard from several small businesses, and what they have said to me, especially those who work in multiple states, is: Listen. We just go with the strictest state, and that's what we do, and then we're good. And we don't worry about all the others because we're meeting that threshold, and then everything falls into place.
LARSON - I think this bill gets you there. And I think this bill puts you on a very strong path to have the strongest state rules, and we're not that far off at all.
FARLEY-BOUVIER - Okay. Thank you so much, Sam.
LARSON - I know. Appreciate it.
FARLEY-BOUVIER - Alright. Thank you for your time. Okay. We're gonna go now to CG, who is online. And then on deck, we have another online person, Olga Medina. CG, are you here?
NIQUETTE - I sure am.
FARLEY-BOUVIER - Okay. Terrific.
CG NIQUETTE - FINDHELP - (R) - Excellent. Thank you. Good afternoon, honorable chairs and members of this committee, and thank you for the opportunity to speak today. My name is CJ Niquette, and I'm here on behalf of Findhelp, a social care data company that has proudly connected the citizens of Massachusetts to a network of over 9,000 free and reduced-cost services over the past decade. We are here in strong support of the Massachusetts Data Privacy Act, and we deeply appreciate the committee's leadership in moving this critical legislation forward. As technology continues to shape every aspect of our lives, from the way we work and interact with our families to how we access essential services, data privacy has never been more important. In this landscape, H 78 stands as a vital safeguard for the privacy of Massachusetts residents, ensuring that individuals are protected from the misuse, exploitation, and inappropriate sale of their personal information. Findhelp's mission is to connect people in need to the programs that serve them with dignity and ease.
As the nation's largest social care network and closed-loop referral system, we handle significant amounts of extraordinarily sensitive data, and this includes social care data: information ranging from housing status to food insecurity to transportation needs. And since our founding in 2010, we have committed to never sell data. Our network is built on a consumer-directed privacy model where individuals opt in to share their information for each referral, and access to referral history is permission-based. Our industry moves at the speed of trust, and our users trust us with their most personal information because we protect it. And that's why we strongly support the Massachusetts Data Privacy Act. This bill represents a step toward ensuring that all personal data is protected.
The bill's provisions on data minimization, consumer rights, and protection of sensitive data are essential, as they ensure that data is only collected when necessary and that individuals have the right to access, correct, and delete their data. We humbly believe that the bill can be further strengthened to define social care data, include it among the protected data, and ban the sale or monetization of data collected in systems like Findhelp. In a world where data breaches and privacy violations are becoming increasingly common, H 78 represents the kind of forward-thinking legislation we need to safeguard the privacy of our citizens. By setting clear standards for how companies collect, process, and store personal data, Massachusetts will lead the way in protecting residents in the digital age. As technology continues to evolve, it's critical that our laws evolve with it to protect the fundamental rights of Massachusetts residents. This bill is an essential step toward that goal. Thank you for your time, and I'm happy to answer any questions.
FARLEY-BOUVIER - Committee? I have a couple of questions. I'm a little surprised to hear it; it sounds like you are, you know, an up-and-coming tech company. You have access to sensitive data, which you could make a lot of money on, but you are in strong support of comprehensive data privacy, 100%. So this is not gonna hurt your business?
NIQUETTE - Nope. It wouldn't. We never have, never will, and would never need to sell data. And so we're strongly in support of this and applaud your leadership on the issue.
FARLEY-BOUVIER - Alright. Is a bill like this, you know, squashing innovation?
NIQUETTE - Not at all. I think this honestly encourages innovation. And I think, ultimately, this is about trust. Right? For us, social care data relating to your housing status, your food insecurity needs to be protected. And if folks don't feel like their data is being protected, they aren't going to trust the systems and platforms that can help them, hopefully, at the end of the day.
FARLEY-BOUVIER - Thank you so much. Appreciate your testimony today. We're gonna now move on to Olga Medina. And then up next is Sarah Geoghegan, I'm sure I got that right, from EPIC. Go ahead, Olga.
OLGA MEDINA - BSA - SB 33 - HB 80 - ( R) Thank you. Good afternoon, members of the committee. My name is Olga Medina, and I represent the Business Software Alliance. We are the leading advocate for the global software industry, so our members make the business-to-business technologies used by companies in every sector of the economy. BSA strongly supports a comprehensive national framework that provides consumers with meaningful rights over their personal data, but we also recognize that states are leading in adopting privacy laws, and we welcome your committee's careful consideration of privacy issues. My testimony today focuses on BSA's core priorities in privacy legislation, which include promoting interoperability with existing state privacy laws, distinguishing between controllers and processors, focusing on consumer privacy by exempting employee data, and providing for exclusive AG enforcement.
First, we urge the committee to recognize the value in adopting a privacy law that's consistent with other state laws. To date, nearly all states have agreed on the same structural model for protecting privacy but have adjusted that model to provide different levels of substantive protections for consumers. To illustrate the broad consistency across state privacy laws, BSA has published a document cataloguing different models of state privacy legislation, and we believe that 2 bills before the committee, SB 33 and HB 80, also known as the Massachusetts Consumer Data Privacy Act, promote a consistent approach to protecting consumer privacy and should guide the committee's approach to privacy legislation. We also encourage you to distinguish between controllers and processors.
Leading global and state privacy laws reflect the fundamental distinction between processors, which handle data on behalf of another company, and controllers, which decide when and why to collect a consumer's personal data. Privacy law should also create important but different obligations for both controllers and processors that reflect their different roles. BSA also supports focusing consumer privacy laws on consumers. To do that, a privacy law should clearly define consumers and exclude individuals acting in an employment context. Furthermore, we urge the legislature to advance privacy legislation that provides strong and exclusive enforcement by the state's attorney general. State attorneys general have a strong track record of enforcing privacy-related laws and should be provided with the resources to enforce any new state privacy law in Massachusetts. So, to conclude, I'd like to thank the committee for the opportunity to testify. We would welcome a chance to discuss these issues with you in more detail as you continue to consider this important topic. Thank you.
FARLEY-BOUVIER - Thank you. We appreciate it. Anything from the committee? All set. We're gonna now move to Sarah. And then on deck is Thomas Kadri.
SARAH GEOGHEGAN - EPIC - HB 99 - SB 47 - (R) - Thank you, Chair Moore, Chair Farley-Bouvier, and members of the committee. My name is Sarah Geoghegan, and I'm senior counsel at the Electronic Privacy Information Center, or EPIC. EPIC is an independent research and advocacy center focused on protecting privacy in the digital age. I am here to testify about surveillance pricing in grocery stores. Surveillance pricing is the practice of changing prices for the same good based on a person's browsing history and personal information, which results in 2 people seeing 2 different prices for the same item. Surveillance pricing is fueled by overcollection and out-of-context processing of our personal information, but this practice is exponentially more invasive when it uses biometric information. Biometric information is highly sensitive because it is unique to each individual person, and unlike a credit card number, it cannot be changed if it has been breached. This information is too sensitive to be used for surveillance pricing purposes. Surveillance pricing is fueled by the data economy.
Apps, websites, wearable tech, cars, and smart devices constantly collect information about us and share this information with data brokers and advertisers to profile us. Surveillance pricing uses these profiles to deliver prices tailored to each individual. While this may seem like a neutral or even benevolent corporate favor to us, offering a nominal discount here or there, this practice costs consumers more money. Consultants recommend surveillance pricing to their clients, promising their profits will increase 2 to 7% in as little as 3 to 6 months. This is a practice in which companies extract and exploit our personal information to charge us the highest price we are willing to pay. This is corporations increasing their profits while taking money out of our pockets. Surveillance pricing opens an even more nefarious window when we are talking about essential products and services. Across Massachusetts and across the country, consumers are hurting. Our costs are increasing, including for absolute necessities like groceries. Companies should not be able to further exploit our personal information to charge us higher prices, which is surveillance pricing's ultimate goal. I support Bill H 99 and S 47 to prohibit our most sensitive information from being used to charge us higher and higher prices. Thank you for your time today.
MOORE - Thank you very much. Do we have any questions? Oh, I guess there's no one here. Thank you very much. Next, we have Thomas Kadri. Is he still here?
SPEAKER7 - Hello. Can you hear me, okay?
SPEAKER6 - Oh, virtually. Okay. I'm sorry. Yep.
THOMAS KADRI - UGA SCHOOL OF LAW - ( R ) - Perfect. Thank you, yeah, thank you for the opportunity to speak with you today. My name is Thomas Kadri, and I'm a law professor at the University of Georgia. I'm also the legislative and policy director at Cornell's Clinic to End Tech Abuse. My work focuses on technology-related interpersonal abuse. So, I work with survivors who are experiencing this abuse. And my scholarship, in particular a recent article that I've coauthored with Chinmayi Sharma and Sam Adler called Brokering Safety, explores how data brokers are complicit in various types of interpersonal abuse. And I'm here today really just to make the point that others have made, so to really echo it, that location privacy, in particular, is a huge concern for survivors.
Knowing someone's whereabouts can leave them intensely vulnerable when they're trying to escape an abusive relationship or a stalking situation, and it can really jeopardize their long-term security and recovery. And this is in part because companies are selling vast amounts of our data that create all sorts of revealing inferences. And for survivors, these inferences can have dangerous and even deadly consequences. And I'm hesitant to repeat what's been said already. And in the interest of time, maybe I'll just try and say something that I haven't heard spoken enough about. And that's the, what is really, I think, a traumatic burden that a lot of survivors now face in the digital age to be constantly monitoring online for traces of their data. Right? So there are risks involved in their data being shared, and those risks can have awful consequences. But there's also this added burden that they face to be checking up for what is out there about them.
And this, I think, can be triggering and distressing for them in a way that is underappreciated because it forces them to be constantly grappling with the abuse that they've already faced again and again while they try and clear up any digital footprints that may be left. And data brokers are really kind of complicit in this. And so laws like the Location Shield Act and some of the other privacy legislation that you have before you today could at least go some small way to alleviating that burden that survivors and many other folks face to be engaging in this kind of privacy self-management that is the norm of today. And so I just wanted to kind of highlight that because I hadn't heard that element of this situation being discussed as much. And so, for those reasons, I commend the committee and those of you who are proposing and supporting the Location Shield Act and other bills that I know contain similar protections in there. You know, lowering the chance that survivors' location data can be bought or shared with their abusers would be a huge help to them. And so, thank you again for allowing me to testify today.
MOORE - Thank you. Thank you for waiting this afternoon. Any questions, Steve? No. No? Okay. So next, we have a panel. Roy Renberg, are you with the panel? So, just start whenever you want and just identify who you are as you go to speak. Thank you.
SARAH RADWAY - HARVARD UNIVERSITY - SB 45 - HB 78 - And members of the committee. My name is Sarah Radway, and I'm here in support of S 45 and HB 78. I am a PhD student in computer science at Harvard. My research, though, is interdisciplinary, so I spend a lot of time at the law school as well. My research focuses on looking at how consumer-facing technology companies interpret law and policy when they design their systems. So, for example, in my research, I may reverse engineer the implementation of a consumer device that collects, for example, biometric data to see how the devices collect, protect, process, and store user information. So essentially, I perform compliance evaluations of consumer tech, and I see how it's possible for companies to make privacy protective choices in the design of their products or not so much.
In terms of problems I see in my research, I'll keep this short, that are relevant to this legislation: I see that current legislative structures do not encourage a culture of privacy. In my work, I've seen organizations intentionally make it very difficult to understand what their data collection, protection, processing, and storage practices are. Companies want to ensure that they will not be punished for behavior that they view as out of alignment with consumer expectations. So I have to spend 1 to 2 years trying to understand the practices associated with 1 device from 1 single company, which is a pretty long time for me. I also see that there's a lack of accountability or enforcement capability when companies may be out of compliance. So enforcement largely relies upon government agencies.
So, in terms of how I think this legislation addresses these issues well, I think that these bills encourage principles of transparency and privacy by design. So by addressing features like dark patterns, they allow consumers to make informed choices, and they require transparency about what data controllers have about consumers and allow for a relationship to form between the companies and consumers. So it kind of negates this need for me to go through and reverse engineer all these implementations just to answer these questions. Additionally, by introducing a private right of action, I think this creates the opportunity for accountability for non-compliant organizations and puts sufficient guardrails in place for misbehavior. Thank you for your time. I appreciate it.
SPEAKER6 - Thank you.
SHANNON ESPINOZA - CONCERNED CITIZEN - HB 78 - SB 45 - SB 29 - HB 104 - Dear chairs, vice chairs, and members of the committee, thank you for the opportunity to testify. My name is Shannon Espinoza. I'm a Filipina immigrant, a Boston Latin School graduate, and a senior at Northeastern, double majoring in data science and biology. I've conducted undergraduate research in computational drug discovery, represented the United States internationally in quantum computing and AI competitions, interned in Congress on AI policy, and served as a visiting scholar at the United Nations. I'm currently a Fulbright semifinalist in computer science research, and I'm here today to strongly encourage your support for H 78, the Massachusetts Consumer Data Privacy Act, as well as S 45, S 29, and H 104, the broader Massachusetts Data Privacy Act. And we are grateful for this committee, especially Chair Moore, for confronting 1 of the most overlooked and urgent ethical rights issues of the digital age.
So, my generation has never had the luxury of privacy. Take the College Board: a few years ago, it sold the data of over 237,000 students in New York. Data is surrendered not by choice but as the price of access to higher education. When I applied for scholarships and research funding, I would end up on websites seeking excessive personal information, from 7 years of address history to medical disclosures. And I filled it out anyway, because what choice did I have if I needed to fund my education? The Commonwealth of Massachusetts is the national epicenter of research, health care, tech, and education, industries handling some of the most sensitive and high-stakes data in the country.
As a student researcher, I work with protected genomic and health information, like institutional data sets used to analyze patterns of cognitive resilience and early neurodegeneration among those with Alzheimer's disease. And that data is locked behind review boards, ethics approval, and secure access protocols because it involves real people's biology and features. When designing quantum and AI frameworks, I scrutinize datasets for accuracy, sensitivity, and risk. I'm trained to ask, is this data necessary? Are we protecting the subject's privacy? And, ironically, I have fewer rights over the data collected from my own browser history than I do over anonymized neurons from research patients. The digital tools I've relied on since grade school, learning apps, test prep tools, and even school Wi-Fi, have quietly harvested and sold my personal data, even as a minor.
For years, there was virtually no oversight. My location, device ID, academic performance, and search patterns are fed into a black box algorithm I'll never see. And I'm held to a high ethical research standard when handling others' data but denied agency over my own. We are 3 among the approximately 1 and a half million students in Massachusetts, from kindergarten through college, facing the digital norm of silent invasions of privacy every day. And I respectfully ask that the committee give a favorable report to H 78 and S 45 with, at a minimum, the 3 critical protections: strong data minimization, a ban on the sale of sensitive information, and a private right of action. You have heard from others about the harms associated with the sale of location data. So, I also urge the committee to advance H 86 or S 185, the Location Shield Act, should a full omnibus bill not move forward this session. This is a once-in-a-generation opportunity for the Commonwealth to lead the nation in protecting the most basic digital rights of students, families, and future generations from the perpetuated status quo of data exploitation. Thank you so much for your time and consideration.
ROY RENBERG - HARVARD UNIVERSITY - HB 78 - HB 104 SB 29 - Thank you, Chair Moore, Chair Farley-Bouvier, and the rest of the committee. My name is Roy Renberg. I'm a computer science PhD student at Harvard University, and I have previously worked as a software engineer and as a cybersecurity consultant at a financial technology company. I live in Cambridge, Massachusetts. My research focuses on data privacy and minimizing the ability of algorithms to learn potentially harmful things about individuals. I'm here to speak about the importance of privacy laws, and in particular, I'm in support of H 78, the Massachusetts Consumer Data Privacy Act, SB 45, HB 104, and S 29. In particular, I wanna talk about privacy laws in relation to data brokers.
MOORE - Excuse me, can you just talk up a little bit louder so everyone can hear.
RENBERG - In particular, I wanna speak about privacy laws in relation to data brokers. I'd like to share a personal anecdote. Before this hearing, 2 weeks ago, I issued a data access request and a data deletion request to 1 of the major data broker companies, Acxiom. And this morning, I received an email containing 53 pages of my own personal information, including every apartment I've ever lived in, inferences about my credit scores, and perhaps most concerningly, my Social Security number. For data brokers, having this data is bad for many reasons. It enables discrimination and exploitation. It makes me more vulnerable to security breaches. I share this because, as someone who works full time on data privacy, if a single data broker has 53 pages on me, I can only imagine what's out there about parents, teachers, and seniors, people with less time or technical know-how to protect themselves.
The only reason I was able to request my data was thanks to the California privacy law, the CCPA. As a Massachusetts resident, I have no such law and no such right. At the moment, these companies don't check where I live, but that's not a legally protected right for me at this moment. I stand before the Massachusetts legislature asking you to pass a comprehensive Massachusetts privacy law emphasizing at least 3 concerns: the right to know, the right to erasure, and strong privacy by default through data minimization, as opposed to consent-based privacy. I urge you to act. Thank you, and I thank the legislature for its time.
FARLEY-BOUVIER - Terrific. Thank you all for coming. I wanna turn to the committee to see if there are any questions. I think anytime we have students in the space, it's a very good thing. So I really do appreciate you being here. Thank you. We're now gonna move to Elizabeth Mitchell, who is online and is part of a panel. So this would be a panel. Then we will move to the next witness, if she is still here, and after that is Caitlyn Vergara.
SPEAKER13 - Elizabeth?
ELIZABETH MITCHELL - MAMA - HB 78 - SB 45 - SB 29 - HB 104 - (R) Yep. I'm here. Thank you so much. My name is Elizabeth Mitchell, and I'm testifying today on behalf of MAMA, Mothers Against Media Addiction, in support of H78, S45, S29, and H104. Mothers Against Media Addiction is a grassroots movement of parents and allies fighting back against media addiction and creating a world where real-life experiences and interaction remain at the heart of a healthy childhood. We have a 3-part mission: educating parents, getting smartphones out of school so kids can learn, and ensuring technology products have basic safeguards like other consumer products.
Like Mothers Against Drunk Driving, we are standing up to massive public health threats and making changes in our homes, communities, and across the nation. We appreciate the leadership this committee, and in particular the chairs, have shown in advancing meaningful privacy protections for Massachusetts residents by sponsoring these bills. Part of the work at MAMA is raising awareness of the critical importance of protecting children's data, as data is often used in ways that are ultimately not in children's best interests. Last year, the College Board reached a settlement with the New York Attorney General and the New York State Education Commission for collecting students' personal information when they were taking the PSAT, SAT, and AP exams in schools, and then selling that data to colleges, scholarship programs, and other customers who used it to solicit students to participate in their programs. This was in violation of New York State privacy laws, which require consent for such transfers. The investigation found that in 2019 alone, the College Board had improperly sold the personal information of more than 237,000 New York students.
A more egregious example comes from New York, Seattle, and Baltimore, where contracts worth millions of dollars were signed with the virtual mental health care provider Talkspace to support struggling youth. According to privacy advocates, Talkspace was leaking data about who visited the website to TikTok, Meta, Snap, and other social media companies that multiple attorneys general are currently suing for harming teen mental health. Consent alone won't fix this problem. Parents of teens cannot be expected to read every lengthy privacy policy they are faced with in order to prevent their data from being sold. And even if they did read those policies, they are left with a take it or leave it choice. A teen would have to choose between taking the SAT and preventing their personal data from being sold. It's not a real choice.
Unfortunately, most state privacy laws don't do enough to protect people's privacy. These laws, including the Virginia and Connecticut laws most often cited by the industry as models states should follow, simply cement the status quo into law. These bills, by contrast, contain the 3 most critical elements needed in a strong privacy bill: strong data minimization rules, restrictions on the sale of sensitive data, and strong enforcement mechanisms. These bills require that entities only collect, use, and transfer data that is reasonably necessary and proportionate to provide or maintain a product or service requested by the consumer. They set heightened protections for sensitive data, which cannot be used for advertising purposes and cannot be sold, a protection included in the recently enacted Maryland Online Data Privacy Act that we urge Massachusetts to adopt. Sorry, I'm out of time, but please also pass the Location Shield Act, and thank you for the opportunity to testify.
FARLEY-BOUVIER - You're terrific, Elizabeth. Thank you so much. Also on this panel are Ariel Fox Johnson and Ava Smithing.
SPEAKER15 - Ariel? Hello.
ARIEL FOX JOHNSON - COMMON SENSE MEDIA - HB 78 - SB 45 - ( R ) Thank you, chairs and committee members. My name is Ariel Fox Johnson, and I'm a senior advisor in data privacy with Common Sense Media, the nation's leading nonpartisan organization dedicated to kids and families, here today in support of H 78 and S 45. These bills are critical steps towards protecting kids and teens online and minimizing the use of everyone's personal data. Current federal law does not protect teens and does not adequately protect children. Youth are especially vulnerable to privacy harms because they spend a lot of time online, including for school, and their brains are still developing. By age 4, more than half of kids have their own devices, and research shows that young teens spend an average of 3 and a half hours a day on social media. Further, kids' brains are not fully developed, and neither children nor teens have a good understanding of long-term consequences, online or elsewhere. Young children don't understand what happens when they share information online.
They believe deleting an app or information in that app deletes it from the Internet. They do not expect or understand that a game that they play may gather information about them from external sources. They can't understand advertisements or that they're trying to sell them something. Teens are primed to prioritize rewards and ignore risks and are especially susceptible to being manipulated. Information is collected from children and teens, and it can be used to label and limit them, affecting their academic and economic opportunities. It also makes them targets for data breaches. Massachusetts must act now to enhance privacy protections for children and teens. Protections should be by default. Kids and their parents shouldn't bear the burden of learning how to change privacy settings. Protections should include bans on targeted ads. Flat prohibitions are best here because they ensure no one will be tricked into consenting to something they do not understand or think is required to use the service. It's also critical that companies not be able to turn a blind eye to youth being on their platforms.
Companies like Meta and TikTok have avoided compliance with current federal privacy law by pretending that they don't have kids on their site because the child didn't self-identify as a kid. At the same time, these companies profit off of such child users for advertising purposes because their algorithms have identified them as children. Companies must protect youth when they know or should have known about young users on their site, at the very least when they willfully disregard youth. Companies should not only be liable when they have narrowly construed actual knowledge of children. We know kids are using social media, and companies know it too. It's time we hold them accountable and protect kids and teens online. Other states have done so, and, I will note, without triggering lawsuits. And Massachusetts residents deserve the same protections as well. Thank you.
SPEAKER1 - Thank you. Ava?
AVA SMITHING - YOUNG PEOPLE'S ALLIANCE - HB 78 - (R) Hi. Can you hear me?
FARLEY-BOUVIER - Yes, we can. The video is great, too.
SMITHING - Okay. Perfect. Thank you, members of the committee, and Chair Farley-Bouvier and Chair Moore. Thank you for having me here to testify in support of H 78. My name is Ava Smithing. I'm the advocacy director at the Young People's Alliance. We're a youth-led and youth-run organization, and our mission is to elevate youth perspectives in policy conversations that impact us. I want to share with you today why this bill matters so deeply to me and the young people I represent, and why we feel so strongly it should become law. When I was 12 years old, I downloaded Instagram. What had started as innocent scrolling quickly changed, and the platform started showing me bikini advertisements instead of my friends.
These ads featured models with unrealistic bodies, and as a young, peer-sensitive girl, I found myself lingering on them. My human instinct to pay more attention to threatening or negative information was exploited. The platform detected my lingering, stored it as data, and wrongly inferred I would want more of that content. My insecurity became a data point. It was linked to my profile, and through collaborative filtering, it determined what content and ads I would see next. Platforms analyze patterns between my data and the data of millions of users. Because users who looked at bikini ads also tended to engage with exercise ads and content, the algorithm predicted I would do the same. This pipeline took me all the way from exercise videos to diet tips and, eventually, to content that promoted an eating disorder. I never consciously chose to see any of this content. My data, personalized recommendation systems, and first-party advertising chose it for me.
The platform shared what they had learned about me between themselves and outside advertisers, and the harmful content followed me across the web onto different social media platforms. This reinforced that what I was seeing was what I was supposed to be seeing and that my eating disorder was how I was supposed to be acting. It was the insecurity-based beauty industry that we all know, on crack. A supercomputer following me everywhere I go, learning from my every action how to be more effectively predatory, adapting itself specifically to me. The harmful content was damaging enough, but what made it truly life-threatening was how the platforms kept me engaged. They use variable reward schedules, the same psychological mechanism used in slot machines, to mix content that triggers both positive and negative reactions in an unpredictable order based on what my data said I would react to. This kind of reward schedule leaves users constantly scrolling and always looking for that next dopamine hit.
Together, these 2 mechanisms addict users to a perfectly tailored feed of increasingly extreme content that exploits their most personal biases, fears, and insecurities. This process is not designed for user well-being or exploration but solely to maximize time spent online and generate profit for the platforms. H 78 addresses this harmful design at its source. By classifying minors' data as sensitive data, it prevents platforms from building excessive profiles that track insecurities, and it also prevents platforms from manipulating content in an order that would addict us to its delivery. By prohibiting the sale of sensitive data and banning both targeted and first-party advertising to minors, H 78 protects young people from ads designed to exploit their vulnerabilities. So it's because of this that Massachusetts has an opportunity to establish a strong data minimization requirement, prevent the sale of sensitive data, and enforce this all with a strong private right of action to ensure companies will be held accountable. I urge you to report H 78 favorably. Thank you for your time and consideration.
FARLEY-BOUVIER - Thank you. Thank you for the testimony of all 3 of you. To the panel, well, first, we'll turn to the committee. To all of you, yeah. Go ahead.
MOORE - I'm sorry. Do you know if this language has been included in other states?
JOHNSON - Which specific language are you talking about?
MOORE - On what you just talked about, the youth language?
JOHNSON - Yeah. So, for example, Maryland has flat-out banned targeted advertising to kids under 18, which I think is good. It also has the knowledge standard of knew or should have known. California has willful disregard. So, there are different states. And I'm sorry, there are many states that have special protections for teenagers now, 10 of them. But some of the strongest ones are in Maryland and California, and New York passed a child privacy act last year as well that has good protections.
MOORE - Thank you.
FARLEY-BOUVIER - So to follow up on that, I think that what we do see vary across states is the definition of minor. In H 78, we define a minor as under 18, so I'd like your thoughts on that. And any of you can jump in on that.
JOHNSON - So, yeah. Sorry, I didn't mean to go ahead and talk first. I think that's where people are moving. That's where our lawmakers are moving, to 18. You know, California's CCPA was the first law to protect teens, and it was under 16, and that was a big fight. I worked on that. Maryland is under 18. New York's Child Data Protection Act is under 18. We see federal proposals now up to 17 and sometimes 18. So while there is variation, we are supportive of under 18.
FARLEY-BOUVIER- Thank you. Does anybody else from the panel have thoughts on that?
SMITHING - Terrific. We believe under 18 is the right... oh, am I muted? No.
FARLEY-BOUVIER - No?
SMITHING - We believe under 18 is the right way to go. These harms don't magically disappear when you turn 17.
FARLEY-BOUVIER - Thank you. Appreciate that.
JOHNSON - Okay. Yep. I totally agree with the other panelists.
FARLEY-BOUVIER - Excellent. Alexander Thorne is on deck. Good afternoon, and welcome to you.
SPEAKER11 - Good afternoon.
SPEAKER1 - Just pull that right up to you.
CAITLYN VERGARA - HARVARD UNIVERSITY - HB 98 - Thank you, Chair Farley-Bouvier, Chair Moore, and the members of this committee. I know we've been here for a long time today.
FARLEY-BOUVIER - We appreciate your patience.
VERGARA - My name is Caitlyn Vergara, and I'm a child online safety advocate. I'm a research assistant at Harvard Medical School studying youth mental health. Before HMS, I was an online child safety researcher across both industry and nonprofits for the past 3 years. But today, I'm here as your constituent, a resident of Austin, and a supporter of H 98, the Act for Internet privacy rights for children. I'm here to further our conversation today on how targeted ads are harmful to children. I know that there are many powerful companies that don't want these acts to be passed, but I know my voice stands with our community, which says that childhood is not for sale. As the first generation to grow up with access to the Internet, I remember being on these social media platforms talking to my friends, but with 1 swipe, I would see advertisements for online dating.
I was only 14 at the time. Many youths share similar experiences. 1 in 7 9-to-12-year-olds reported having an online sexual interaction with an adult. In 2021, 10 percent of 13-to-17-year-olds reported using OnlyFans, a company that profits off of people selling their own nude images. So we have to ask ourselves, how do harmful advertisements groom children into accepting adult-like relationships? I know from working in the industry that companies have the ability to differentiate between child profiles and adult profiles. These things have been said again and again over the course of today. But will today's children have to wait until they are my age to sit in front of you talking about the same privacy issues? Childhood is not for sale. This committee knows that. The children of Massachusetts know that. But will the laws of Massachusetts reflect whether or not that is true? Thank you for your time, and thank you for your public service.
FARLEY-BOUVIER - Thank you for that strong testimony. We appreciate it. Are there any questions or comments from the committee? Terrific. Thank you again for being here, and thank you for your patience today. We're gonna now move to Alexander Thorne, and up on deck is Erica Teskowick. No. That was really bad. Sorry. Thank you. We really appreciate your patience with us today.
ALEXANDER THORNE - CONCERNED CITIZEN - HB 104 - SB 45 - SB 29 - And I appreciate yours. Thank you to the chairs, to the committee, and to everyone else who's still here. So, I am here to testify in favor of the strongest possible comprehensive protections for digital privacy and enforceable regulation of the collection and sharing of personal data, including geolocation data, transactions, and personal communications. The need for modernized data protections has steadily increased in recent years, as we well know by now, and is even more acute under the current presidential administration. I therefore want to throw my support behind the Massachusetts Data Privacy Act, H 104, S 45, and S 29, and the Location Shield Act, H 86 and S 197, as providing an important step forward. I'm not gonna go through all of the various abuses that you are aware of by now, in terms of what kinds of data are collected on us, but I wanna highlight the role of data brokers, and particularly the critical need for better protections, in the current political context.
Even before the current administration, Immigration and Customs Enforcement bought huge volumes of cell phone location data from the data brokers Venntel and Babel Street. This is only going to accelerate if left unchecked. Real-time geolocation data likely facilitated the abduction of Tufts graduate student Rumeysa Ozturk from the street near her home in Somerville on March 27. I live in Somerville, and I work at Tufts University, and I am feeling the pain of this as an attack on my own community. Similarly, the detention of tourists who have criticized Donald Trump continues a pattern of ICE using media-monitoring data brokers such as Ghost and ShadowDragon. The administration has clearly signaled that it does not want to stop at legal or undocumented immigrants but is prepared to move on to citizens as well. The time to act is now. Please pass the strongest possible comprehensive data privacy bill.
FARLEY-BOUVIER - Thank you, Alexander. We really appreciate your testimony and your patience today. Are there any comments or questions from the committee? Good. Thank you so much. If we could go to Erica Teskowick and then go back to virtual, we have Edward Klein. Welcome, Erica.
SPEAKER19 - Hi. It's a beautiful
SPEAKER1 - sign when I knew this. We're very happy to have you here.
SPEAKER16 - Happy to be here.
ERICA TESKOWICK - CONCERNED CITIZEN - HB 99 - SB 47 - Thank you to the committee and the members. So I live in a particularly rough part of the South Boston area, a neighborhood where most people I encounter, including myself, rely on food stamps or some other public benefit. But with the current socioeconomic climate, those benefits aren't going to take us much further along in the months to come. And the addition of dynamic pricing through the use of dangerously discriminatory AI tools will not only cripple a source of sustenance and culture for my neighborhood, which is food; Black families and other families of color make up the majority of my neighborhood, and this will make them fear for their safety and privacy while going out for a simple errand most of us take for granted or pay no mind to. I think back to that brilliant group of teenagers at the Hyde Square Task Force who, back in 2023,
through their own grassroots experiments, discovered that Stop and Shop charged more in diverse neighborhoods than in more affluent neighborhoods for the same exact name-brand items. And I can't help but worry about AI enabling such corporations to harm even more Massachusetts families this way. For a public housing complex that was originally built in the 1930s, I often feel like my neighborhood is still stuck in the past. Yet it has withstood numerous attempts to erode its community far before my time. Its people still find ways to connect and help each other. We will lend a neighbor a cup of sugar, or in my instance, lend a neighbor a few slices of pizza when she hasn't eaten in 2 days. And we help each other when and where it counts. And we don't want nor need the likes of proven oppressive technologies to undermine that. I'm here in support of the bills H 99 and S 47.
FARLEY-BOUVIER - Thank you so much for your testimony. It's really great to have you here. Any comments or questions from the committee? Senator? Good. Thank you. Thank you again.
SPEAKER8 - Have a good night.
FARLEY-BOUVIER - We're just checking quickly. Is Edward Klein online? It seems like he's no longer with us. So we are gonna now go to Kade Crockford. And, on deck is James O'Keefe, who is also online. Nice to have you here, Kade.
KADE CROCKFORD - ACLU - Hi, folks. Good evening. Oh. How are you? You did well. Nice to see you all. Thank you so much for coming back after your votes, and to the chairs and your staff for all the hard work that you all have done on this issue this session and the last session. I wanna sort of take a step back and talk not about the specifics of the bills themselves so much, but about the moment we find ourselves in as a country and the importance, I think, of passing strong data privacy legislation. I don't need to tell you about the crisis that we're in. But I do wanna say that for far too long, the wealthy and the powerful in this country have been able to write the rules. And in the case of consumer privacy law, maybe avoid rules altogether. Right? For years, we've been told that Big Tech is working in our favor. But I think recent events, such as the parade of big tech billionaires lining up to pad the president's pockets at his inauguration, should clear up any doubt about where they really stand. Data is the new oil. That's what we hear all the time.
And it's 1 of the most precious commodities, enabling companies of all different kinds to control us, manipulate prices, and stifle competition. And so I wanna ask the committee, and through you, I hope, other members of leadership in this building, to think about data privacy regulation the way that our predecessors, your predecessors, people in the 1960s and 1970s, thought about environmental protection. Industrial polluters in the 1960s and 1970s warned that regulations to stop them from pumping toxins into our water and our air would hurt their profits and disadvantage American businesses. And today, we hear representatives of similar robber barons from the tech industry saying similar things about stopping meaningful privacy regulations. The reality is that, like those polluters, companies profit from the collection of our data. But the benefits to the people of Massachusetts and to the country, like clean air and water, are more than worth it. Personal autonomy, democracy, economic freedom, so many of the arguments that we've heard from so many different people representing so many different communities today. So I wanna say something about elite impunity.
I think 1 of the reasons that we find ourselves in this situation today with Donald Trump as the president again is the sense among voters that people in power don't listen to them and don't care about them, that politicians are corrupt, that, you know, we gotta drain the swamp. Right? That the government doesn't work for regular people, but works for the rich and powerful and the well connected. And so before you, you have legislation that you 2 filed, as chairs, as well as the Location Shield Act, that huge numbers of Massachusetts voters support. Right? The Location Shield Act, polling shows, is supported by 92% of likely voters. That's left, right, center. Our federal government is in crisis. It's a crisis that's enabled by Silicon Valley billionaires. States like Massachusetts, during this crisis, have an obligation to respond not merely by filing lawsuits as our attorney general is doing, but also by legislating, by demonstrating that here, the will of the people reigns, that no matter what's happening in Washington, we are a democracy. And in our democracy, we strongly support your legislation as well as the Location Shield Act and, really, any bill that the committee puts out that comprehensively deals with these issues. And as my many colleagues have said, you can tell our message discipline is pretty good. Right? We want strong data minimization, a ban on the sale of sensitive data, and robust enforcement through a private right of action. So thank you very much, and thank you to your staff as well. Thank you.
FARLEY-BOUVIER - Thank you for your strong testimony. I appreciate you being here. Are there any comments or questions from the committee? Senator?
MOORE - Thank you, Kade, for coming here. I was actually down in Washington last week. I didn't see you this time. But thank you for testifying. And I think, based on the legislation that's filed and the like, I think the consensus opinion is on what we should be looking at. But as you and the ACLU have done in the past, you know, we need your help. I know it's not gonna be to the gravity of what was out this past weekend, people across the country and the state, but we need to get this legislation passed. We are gonna need a grassroots effort here in the state, in the state house, to help us achieve these goals. So thank you. And, again, thank you for coming and testifying, and great testimony.
FARLEY-BOUVIER - Thank you, Senator. Thank you very much. Appreciate it. Thank you. So, it's our understanding that James O'Keefe is no longer with us here. But so we're gonna move on to Ryan. No. Wait a minute. Arielle Garcia? Here?
SPEAKER11 - Did you
SPEAKER22 - say Ryan?
SPEAKER1 - Oh, yeah. Ryan. Oh, sorry. We're gonna do First, we're gonna do, Ariel and then Ryan. Okay? Thank you.
SPEAKER16 - Okay. Alright. Thank you.
SPEAKER1 - There you go.
ARIELLE GARCIA - CHECK MY ADS INSTITUTE - SB 45 - HB 78 - ( R ) Chairs Moore and Farley-Bouvier, and members of the committee, thank you for the opportunity to testify today in support of H 78, the Massachusetts Consumer Data Privacy Act, and S 45. My name is Arielle Garcia. I'm the chief operating officer of the Check My Ads Institute, an independent nonprofit watchdog advocating for a transparent and fair digital ad market. Prior to joining Check My Ads in 2024, I spent a decade in the ad industry at UM Worldwide, a global ad agency. I resigned because I realized that the only winners in the industry today are big tech giants like Google and the data brokers and advertising technology, or ad tech, middlemen that thrive in their shadows. My testimony today will challenge common fallacies pushed by the big-tech-led industry lobby. The $700,000,000,000 digital ad industry is the main business model of the Internet, and it is broken.
Firms like Google and Meta have built empires on unchecked extraction. Their power has set norms that entrench that power while harming users, advertisers, and publishers. At the heart of this model is programmatic advertising, which relies on real-time ad auctions powered by constant tracking. This opaque and unregulated system isn't just bad for privacy; it's also bad for businesses. Only 36¢ of every programmatic ad dollar reaches the publisher. The rest goes to ad tech middlemen or is lost to fraud and spam. Ad fraud alone accounts for over $84,000,000,000 globally, second only to the drug trade as a source of revenue for organized crime. Now, how can that be? The reality is that the consumer data peddled by data brokers for ad targeting is shockingly inaccurate. Research has shown that leading data brokers were only right about people's gender 42% of the time, worse than guessing at random. This is what advertisers are paying for. Acxiom, 1 of the largest data brokers, has admitted that their data is based on informed guesses.
Their chief privacy officer has said that he hopes that if they guess wrong, it doesn't result in denial of benefits or credit. But credit agencies have purchased Acxiom data. A different data broker, owned by ad giant Publicis, has sold data to the perpetrators of elder fraud. As Attorney General Campbell may recall, Publicis's health unit in 2024 entered a $350,000,000 settlement for its role in the opioid crisis, where it was accused of using doctor-patient record data to gain insight on how to push opioids on pain sufferers in higher doses. This is not efficiency. It's a dangerous market failure. Without privacy laws that codify transparency and choice, include principles of data minimization, and restrict the use of sensitive data, these harms will continue. To close, these bills would not prevent relevant advertising. What they would do is reduce the supply of low-quality and harmful data and foster a system that works better for both Massachusetts businesses and for the safety and liberty of its residents. Thank you once again for the opportunity to speak today.
FARLEY-BOUVIER - Sure. Thank you. It's a whole different view. Members of the committee? Thank you, Arielle. We really do appreciate your testimony. Up next is Ryan Kearney, and we will then go back online to Grace Gedye.
RYAN KEARNEY - RETAILERS ASSOCIATION OF MASSACHUSETTS - HB 80 - SB 45 - SB 29 - HB 104 - Chairwoman Farley-Bouvier, Chairman Moore, and members of the committee, thank you for the opportunity to testify before you today. For the record, my name is Ryan Kearney. I am the general counsel of the Retailers Association of Massachusetts. We are a statewide trade association of 4,000 member businesses in the retail, restaurant, and wholesale sectors of the retail industry, and our membership ranges from the independent, owner-operated businesses that operate in every single 1 of your districts on Main Street all the way up to the regional and national chains. I'm here today to testify on the various consumer data privacy bills before you. RAM is in support of House 80 and Senate 33, and in opposition to House 78, House 104, Senate 29, and Senate 45. By way of background, the collection of consumer data is an essential part of any retail business. It's utilized for various purposes, including conducting sales transactions, deliveries and returns, detection of fraud, and maintenance of loyalty programs.
In each of these instances, the data that's provided by the consumer allows for more efficient business operations and a more positive customer experience. Simply stated, retailers utilize consumer data for the principal purpose of serving their customers as they wish to be served. In speaking with my membership, none of them have said that they would not be able to comply with any of these laws. The concern is that the costs, obligations, and potential liability created by some of these proposals will interfere with their ability to continue to serve customers in the manner that they are accustomed to and in the way that customers expect them to. For example, retail loyalty and discount programs, which are widely popular amongst consumers, could be negatively impacted. 72% of American adults online are members of 1 or more programs, with the average adult belonging to 9 of these programs.
They rely on the retailer's ability to provide exclusive offers to voluntary participants versus nonparticipants, and where non-participation is a result of a consumer exercising a right afforded to the consumer by privacy law, there's an inherent compliance conflict when nondiscrimination provisions are included. To allow for the continued offer of these programs, any proposal that is adopted by this committee or referred favorably by the committee should make clear that nondiscrimination provisions should not be interpreted to prohibit, restrict, or limit the ability of retailers to maintain these bona fide loyalty programs. I'd be happy to work with the committee to make sure that language is, in fact, where it needs to be to do so. It's been mentioned before about the uniformity of laws. RAM does believe that the regulation of consumer data collection and privacy should be dealt with in a uniform manner across the country.
This would not only ease compliance but also provide equitable protections and rights for all consumers across the country. With 20 states already having laws in place, Massachusetts should be looking to the emerging body of state law for guidance rather than crafting its own unique framework, which would come with additional costs for compliance. In today's economy, businesses are rapidly connected across state lines, in particular here in New England, due to our close geography and high level of mobility, so a uniform approach would make for more streamlined compliance and a consistent expectation from consumers. Remaining in line with state law, particularly those laws adopted in other states, would provide, again, that uniform expectation locally and, again, ease compliance. We therefore support House 80 and Senate 33, as they are modeled on the framework adopted by a growing number of states across the country, including various states here in New England. I'm happy to work with the committee over the course of this session on language that would create the rights and protections that we're looking for without interfering with the operations of Massachusetts businesses.
FARLEY-BOUVIER - That's terrific. And I would very much look forward to working directly with you on language to ensure that your members, particularly, again, I'm gonna focus on small business, true small business. Understood. Yeah. We heard a lot of testimony earlier that talked about, for example, H 78 being good for small business, because the big data folks are hoovering up, you know, both consumer data, harming small business, but also the small business data that can, you know, take business away from them. So I'd like your reaction to that and how, again, you know, you have a responsibility to your members, including those of the true small business.
KEARNEY - Sure. So I will echo the sentiments that you heard from the gentleman earlier. We believe that House 80 and Senate 33 do protect small businesses. 1 thing that you're going to hear, whether it's from small businesses or large businesses in our association throughout this session, is that there's a focus in the Healey administration and from this legislature on competitiveness here in Massachusetts. And whether it's this or any other law, we're gonna ask you all to look at the legislation that you're passing through the lens of compliance costs and what would potentially happen once you implement these things. On this issue in particular, our concern is that if you create a Massachusetts-only framework that then has to be replicated, there are really 2 ways to go for compliance, which have already been mentioned.
You create your own compliance framework here in Massachusetts, which is gonna cost additional dollars, and not just copy-paste what you see in other states. Or you take all those 20 states that you've made investments in already, and you tie them up to the Massachusetts standard. You can do that. And that's perfectly fine. And I'm assuming, from my conversations, my members would be able to comply with that. It may cost them additional dollars, but you have to be aware that those additional dollars are seen as a mandated cost to businesses in a state that's already pretty much known for having a high cost of doing business in the country.
FARLEY-BOUVIER - Okay. You mentioned other states, including especially a New England model. You're aware of the work that's been done in Connecticut. A few years ago, they started what I call Connecticut 2. They made immediate changes to that to update it. And now the lead sponsor, who is a national figure now in this space, is updating his law. And quite frankly, it almost mirrors H 78.
KEARNEY - Sure. I understand that there have been updates, but there have also been 2 other states in the New England region that have adopted that original Connecticut 1.0 model. And so, again, I think you and I have had a conversation when we first had this conversation. It was your trip to England that you had mentioned, how confusing it was, whether I opt in, whether I opt out. We don't want that in our system. Despite what some folks may think of the retail industry, our consumers are 1 of our top assets, if not the top asset. Without them, we don't exist. And so to be able to allow them to go to New Hampshire, Massachusetts, Connecticut, and the New England region, and get the same expectation of privacy and the same rights across the board, is important to us. And I think it's something that we support. But again, we do not see the need, nor do we want, to kind of reinvent the wheel on this issue when it's already been kind of litigated throughout 20 states already. And so, again, I'm happy to work with you on 78 and, yeah, perfecting that. But there's a concern there.
And again, folks have mentioned it. Those concerns are exacerbated when you have a private right of action. And that's 1 thing I will mention. You asked me about small businesses. The plaintiff's bar, if they do start creating these niche industries to go after these things, and we've seen them, whether it's our ADA website laws, whether it's our Wiretap Act and some of the software that people use online, or it's the lie detector requirement in job applications. There are plaintiff's attorneys that will go out and start filing lawsuits on anybody and everybody that they possibly can. And the calls that I get are not from my big guys. I'm getting calls from the inn on the Cape. I'm getting the call from the single operator on Main Street who has been hit with a lawsuit and now has to hire an attorney and spend 10,000 to 20,000 dollars just to prove that this is a frivolous lawsuit. And when you include the level of statutory penalties like this law has, the concern is that it makes it more appetizing for someone to come here and do it here in Massachusetts than they would in other places. So those are the concerns that we have.
FARLEY-BOUVIER - Sure. So I wanna make a correction, and then note something. When I gave you that story about Scotland a year or 2 ago, I was delighted that they were asking for those. Because I was like, look at this, it really works. You know? And I thought that was terrific. And it was a way that was easy for me, the consumer, to use.
KEARNEY - Right. And my concern is, if you went from Ireland into mainland Europe and then you came back to the United States, you're gonna have a different standard in each place. And I guess you can expect that when you're crossing, you know, country lines. But when you're going from, I mean, we have folks that are coming from Massachusetts into New Hampshire and back on a daily basis, and all the other 6 New England states. And so to not have that bright-line test, I think, is a concern.
FARLEY-BOUVIER - And then the other thing, and I promise, I know you're anxious too, Senator, my partner here. The thresholds within H 78: the private right of action does not apply, I'm sorry, we have a threshold standard, so the true small business, the inn on the Cape, right, they are not subject to the private right of action.
KEARNEY - Right. So my concern is that the plaintiff's bar will not show discretion as to whether or not someone is above or below that threshold. And so small businesses may get hit with a demand letter, and then their only recourse is to get the frivolous lawsuit thrown out, or just to prove to the court that, hey, I'm under the threshold, I should not be called into this. Whereas the attorney general's office, or a dedicated government entity as the enforcement mechanism, would be able, and would probably be more likely, to use that discretion and say, hey, wait a second, this doesn't have to go forward. We don't need anybody spending all this money out of their own pocket as a small business.
FARLEY-BOUVIER - Thank you for clarifying. Senator, thanks for your patience, too.
MOORE - Thank you. So you referenced the economic development bill and how 1 of the goals from the legislature and the governor's office is to make Massachusetts more attractive to industry, more competitive.
KEARNEY - Yeah.
MOORE - Alright. The question has been asked about the privacy bills, and there are 20 other states that have actually enacted legislation regarding this. We've had no data presented to us, and no one's come to us showing how these states have been hurt by the passage of these data privacy laws. So, what, hold on. Okay. So we've had no data come forward on that. The legislation on data minimization allows businesses, even small businesses, to collect the data necessary. And I'm just gonna talk about Senate 45. Senate 45 clearly has an exemption for what you've talked about: bona fide loyalty rewards, premium features, discounts, and club card programs. So, you're advocating against certain bills. But all I guess, all I'm gonna ask for is that when you're advocating on these, you don't distort the bills or the legislation.
KEARNEY - I apologize if I gave any distortion as to what was going on.
MOORE - Well, you clearly said that this was an issue with the bills, that your members have this issue with the loyalty reward cards and the loyalty programs, that it could subject them to litigation. So, to make the accusation, or I guess the inference, that we are not looking at benefiting our small businesses or the members of your association. Unless you can produce some data showing that California, Maryland, Virginia, all these other states that have passed this, with any sort of right of action or a limited right of action, have seen a detriment to their economy, then I don't think it's fair to say that this legislation is going to hurt or make Massachusetts not competitive.
KEARNEY - So I apologize if that was not clear. There is a cost associated with implementing this program or implementing any framework whatsoever. Right. If you have to implement a Massachusetts-only framework, that's gonna be more expensive than if you were able to just take what you've already built in 20 other states and put it onto the Massachusetts chassis. In addition, there's an ongoing cost that would be created by a private right of action that could result in multiple violations, multiple claims, and that's the ongoing cost that the industry will have to bear in order to comply with the complexity of these laws.
MOORE - But shouldn't the consumer have a right to expect the business that they're dealing with to comply with the law? And everyone has a right, it's been talked about all day, the right of privacy, the right to your personal information not being sold.
KEARNEY - And at the outset of my testimony, I indicated that all of my members have indicated that they will comply with and can comply with whatever framework you throw at them. But a Massachusetts-only law that is gonna cost them more is going to impose a mandated operating cost on the business.
SPEAKER11 - And it
SPEAKER12 - says but Elevated.
MOORE - But there are plenty of laws that Massachusetts has that are different from other states.
KEARNEY - Of course there is. And the research that we've done, in general, on the business climate in Massachusetts is that Massachusetts is 1 of the worst and most costly states to do business in. And when we adopt laws like this, or we adopt a health care mandate, or we adopt an environmental or climate mandate on utilities, all those go to the cost of doing business. And the concern is that as you add those up in the aggregate, it continues to exacerbate the problem that Massachusetts business has already had.
MOORE - So any program or law that's gonna benefit the public health or the consumer is a law we shouldn't pass?
KEARNEY - I didn't say that. I'm saying that
MOORE - Basically, it's saying that it's
KEARNEY - I'm saying that when you are making these decisions, you should be thoughtful about the additional costs that are gonna be incurred when you do a Massachusetts-only style bill versus what we've seen in the rest of the states. Again, our members understand the concern. And if our members are found to be using this information, this data, in an inappropriate way, they're gonna lose the trust of their consumers, and they know that. So, there's already an inherent protection there. I understand that there's a sentiment, and I potentially would agree with it, that there are bad actors out there. But again, these businesses are doing their work.
They're using this information for the benefit of their business purposes, of course, but also for the consumer as well. And so I'm just not quite sure I have an answer for you. But again, we are trying to figure out a way to work with you on this. And I apologize if I come off as making it sound like your work is not appreciated. I understand it, but again, it's creating a Massachusetts-only framework that we're gonna have to cover for. Someone's going to have to pay for that. And so you can look at the big box stores and say, it's gonna be them. But again, that trickles down to the cost of doing business, which then goes into the prices, which then goes down to the consumer. And so I'm trying to figure out a way to protect the consumer but also allow businesses to continue to compete with the rest of the country.
MOORE - And we do have, I think, in every version an exemption for true small businesses. So, small businesses have been considered in this.
KEARNEY - Yes. So that's all. Thank you. Understood.
FARLEY-BOUVIER - We appreciate it, and we particularly appreciate your offer to continue to work closely with us. Absolutely.
KEARNEY - You guys have been widely receptive. I really appreciate the work you've done.
FARLEY-BOUVIER - We're very earnest in that. And as you know, our timetable is very different this session than it was last, so we hope to hear from you soon.
KEARNEY - Excellent. Thank you.
FARLEY-BOUVIER - Okay? Thank you so much. We're gonna now go to Grace, who is online, and we are gonna follow up with Tasha Adler, who will be on deck. Grace?
SPEAKER15 - Yes. Good evening, chairs. Can you hear me?
SPEAKER1 - Yes. Thank you for your patience.
GRACE GEDYE - CONSUMER REPORTS - (R) Fantastic. Well, thank you all for having this very thorough, comprehensive hearing. I know it's been a very long day for you all and your staff. Honorable members of the committee, my name is Grace Gedye, and I'm a policy analyst at Consumer Reports, where I focus on consumer protection issues in AI policy. Consumer Reports is a nonpartisan nonprofit with 6,000,000 members across the US, and in Massachusetts specifically, we have more than 38,000 members. I'm here to testify regarding House Bill 99, which would prohibit the use of biometric data, like fingerprints or retina scans, in pricing grocery store items. That is, using something like a scan of your iris to determine the price that you specifically will pay for a bundle of bananas. It would be frankly dystopian if grocery stores widely implemented this practice.
Yet, several years ago, companies were already piloting these systems. Kroger announced publicly that it was working with Microsoft to produce electronic shelves for grocery stores that would use video analytics so that, quote, personalized offers and advertisements can be presented based on customer demographics, end quote. As Fast Company further reported, quote, a camera at each display will determine, through facial recognition AI, the gender and age of the shopper passing by, end quote. This is creepy. I don't know about you all, but I don't want grocery store shelves spying on me and detecting my age and gender. Prohibiting the use of biometric data in grocery store pricing should be an easy yes. More broadly, we would encourage Massachusetts to tackle the larger problem of personalized pricing, also known as surveillance pricing.
Biometric data is just 1 type of data that retailers can use to offer individualized pricing based on what they think each of us is individually willing to pay. They can also gather data on our location, our search history, what we look at online, the type of device we're using, and more to make inferences about our income, specific health conditions, how old our kids are, our compulsions, and our interests. And they can use this data to increase prices when, say, a diabetic is particularly desperate to buy insulin syringes or when a person needs to purchase a plane ticket to get to a funeral. We'd encourage the Massachusetts legislature to tackle this broader problem of surveillance pricing, and I look forward to connecting with the author of this specific, more narrowly targeted bill related to grocery store prices and biometric data. Thank you.
FARLEY-BOUVIER - Thank you. Any comments or questions from the committee? Thank you very much, Grace. We're now going to go to Tasha Adler. Hi, Tasha. Welcome, and thank you for your patience.
SPEAKER17 - Go ahead.
TASHA ADLER - BOSTON UNIVERSITY - Hello. My name is Tasha Adler, and I'm a cybersecurity PhD student at Boston University. I study wireless security and privacy, and I'm here to speak in favor of the Massachusetts Data Privacy Act and similar strong, comprehensive data privacy legislation. First, I'll say that as a researcher, I have seen up close the granular, detailed data that car companies have collected about users, and I am very concerned about it. Second, I'll say that as a student, I've also seen the impact of these data collection practices on students who express themselves and their political beliefs online. I believe that Massachusetts should do everything possible to prevent the transfer or sale of sensitive information to companies that, in turn, sell it to law enforcement agencies. Many of my classmates who have not engaged in any public protests are still afraid to visit their families during summer break because they're scared that they're gonna get blocked from returning because of, say, something on their social media or even something that they engaged with online.
And, you know, again, these are students that did not participate in protests or do anything controversial. I've recently moved here from California, 1 of the other states that has a strong data privacy law, and I feel like I've almost lost some rights coming here, so I would really love to see Massachusetts enact 1 of these policies. In Massachusetts, it's currently legal for companies to collect unnecessarily specific data regarding location, health history, sexuality, race, and religion, all of which can be sold and weaponized, as we've heard from previous testimony. Some of the specific instances that disturb me the most are health insurance companies blindly buying client data and using this data to predict the health outcomes of their clients, presumably for the purpose of eventually raising their insurance rates.
I'm also disturbed by the use of data collected on apps that target religious minorities, and also by the anti-abortion advocates that are collecting lists of women who visited abortion clinics. Strong, comprehensive data privacy legislation would prevent these sorts of abuses by requiring that companies abide by a strictly necessary standard when collecting our sensitive data and demanding that they get our consent every time they want to transfer our personal data. Companies would be required to disclose to us any third parties to which they have transferred our data and the purpose of the transfer. It bans the sale of geolocation data and allows Massachusetts residents to request the correction or deletion of personal data. It allows us to opt out of profiling, which is both dangerous and flawed. Most importantly, I appreciate the private right of action, because I think it's vital for enforcement given that the attorney general's budget is limited. Thank you for your time.
FARLEY-BOUVIER - Thank you. Are there any comments or questions from the committee? I have 1. Can you talk more about car data? You said that you get to see this. Can you tell me what that's like? What does my car collect on me?
ADLER - So, while this isn't my specific research project, I can refer you to Northeastern University students who are working with this. I've had the privilege of sitting with them and looking through this data with them. And it depends on the company. 1 of the worst is, unsurprisingly, Tesla. They collect, you know, information like how many people are in the car with you, your exact speed, when you use the brakes, sometimes even when you roll down your windows. These are very, very specific pieces of information, and I don't really see what the purpose of collecting them is. No. That's disturbing.
FARLEY-BOUVIER - Thank you for your testimony. Oh, interesting.
REP HAWKINS - My car tells me when the dogs are in the back seat. Yeah. And how does it know that? Yeah.
FARLEY-BOUVIER - Yeah. So, appreciate your testimony. When we have the voice of students, it really is very meaningful to us. So thank you very, very much. Again, thank you for your patience.
SPEAKER8 - Thank you. Appreciate it.
FARLEY-BOUVIER - Okay. So, were you David? We're going to David Twan. Is David here? Yes. No. Next is Christopher Stark.
Thank you. And then we will have Emily Anesta.
CHRISTOPHER STARK - MASSACHUSETTS AND RHODE ISLAND INSURANCE FEDERATION - Chair Farley-Bouvier, Chair Moore, members of the committee, my name is Christopher Stark. I'm the executive director of the Mass Insurance Federation. We represent 25 member companies from the property casualty insurance sector. So, we are home, auto, workers' comp, and commercial lines of insurance. It's easier to say what we're not: we're not life, and we're not health. But we pretty much cover everything else. Chair Farley-Bouvier, I especially wanna thank you for taking the time last year to sit down with us and really dive into your legislation last session. I'll submit written testimony in consideration of time, but I wanna hit on some of the issues that we've heard today. I'll start with the entity-level exemption versus the information-level exemption. 1 of the key concerns that some of our members have is how the definitions work for GLBA.
And so you have a consumer definition and a customer definition for that. But what it won't capture is some of the data that is used for workers' compensation, because there, the data that's collected is on the employee, and the employee is not the customer; it's the customer's data that's protected under GLBA. Or commercial-to-commercial interactions, where we may be working with the insureds on some sort of risk reduction measures they may take, like dash cams in some commercial vehicles. How would that actually interact with the text of this legislation without an entity-level exemption? And so, that's why the Federation strongly supports H 80 as the starting point, mostly because of that use of the entity-level exemption. That is a key priority for us when it comes to these pieces of legislation, any piece of privacy legislation: making sure that it includes an entity-level exemption.
And that's particularly important because of the private cause of action. Insurers are already regulated under the Gramm-Leach-Bliley Act. Massachusetts Chapter 93H has some of these provisions as well, along with regulatory authority for both the Division of Insurance and the Office of the Attorney General. And so there's a level of concern about what information is or is not covered, and how many of these cases we will now have to oversee, in a new era where we already know that abuse of the legal system is a key factor in the increases we've seen over the past 5 years in the price of insurance products. The other aspects, though, that I do wanna touch on are the location shield, as well as the location sections of the data privacy bills. And I think that gets to part of the conversation about the quote-unquote patchwork, or the importance of taking a New England approach, when we have specific guidelines for what data can be collected and specific guidelines on what must be presented in terms of consent.
Well, what happens to our telematics devices in a vehicle? By the definition, especially in the Shield Act, it covers individuals, anybody that's even located in the Commonwealth. So you signed an agreement in another state. It may not have been precisely the language that we have in our legislation. And so is the expectation then, somehow, that companies invest in the capabilities to shut that off immediately when they cross the Massachusetts border? Because there's no ability to get that affirmative consent, or that same level of consent that the other states in which they sign these agreements might require, especially for telematics devices. There's a similar concern, especially with the Shield Act, because it talks about subverting choice. And so again, when you have a private right of action attached, what does subverting choice mean? Is just telling them that they could get a discount for their car insurance if they place a telematics device impacting their choice about whether or not they do this?
And then that gets into the retaliatory section of this, again, where it says that you cannot have an impact on price, even a discount. However, the Division of Insurance requires that if we're going to use telematics devices, it can only be with a discount. I think those are 2 of the key components of the discussion today. I just wanted to highlight how insurance can illustrate why, 1, an entity-level exemption is necessary, and we're not asking for it for any nefarious purposes, but more to make sure that the information that is needed for insurance contracts is able to be gathered even under this law, and, 2, the necessity of having at least some standardization between us and our New England neighbors in this economy. Thank you, Chair Farley-Bouvier, Chair Moore.
FARLEY-BOUVIER - Thanks. I'm gonna start off by just asking: would your members, by and large, and you have more than 1, be defined as data brokers in Vermont or California?
STARK - I'll get back to you on that 1. I'm not sure whether they'd be the brokers or the collectors, typically, of that information under CCPA.
FARLEY-BOUVIER - So it would be different. Right? Right. The broker piece, definitely. And just clarifying, we do not ban the collection of location data in our bills. We ban the buying and selling of precise location data in our bill. That's why it's not the same. Right? And so, would your members be buying location data?
STARK - It depends on the relationship. For 1, we would definitely want to have in there at least some ability for transfer to affiliates. Because, say, and I don't know exactly how it works, I won't use any company in particular, but let's use a telematics device that uses location services as well. If that company, in the way they branded that particular device, has it as an affiliate of the company, I think we'd have to allow that, because then we're talking about transfers even within the company umbrella that the bill hits on. And so, I'd have to see on that. The other aspect is that, yes, sure, some of my largest national carriers absolutely can afford to purchase or implement a telematics or usage-based insurance model on their own. Several have. But some of the smaller carriers, especially, and we have a large percentage of domestic small mutuals in Massachusetts, if they wanted to implement a usage-based insurance model, they would probably have to use a vendor in the space. So, in that sense, yes. Even if it is the vendor managing that, there would be a monetary exchange and, therefore, a purchase or a sale of
STARK - That data.
FARLEY-BOUVIER - I think that makes sense. What you're saying does make sense. So if there's a transfer of information to the agency, that's 1 thing. But do you sell it to McDonald's? That's a really different thing.
STARK - Right. No. I understand.
FARLEY-BOUVIER - How do we capture that? That is what we would have to work on. Right. You know?
STARK - Well, yeah. Because it also gets into the other kind of unintended consequences, and it's in my written testimony, but just a few examples of ways that we are required by law to report some personal information, even sensitive personal information: Medicaid and Medicare liens, although probably the HIPAA language covers that; child support liens; fire loss data reporting; mandates on reports of stolen vehicles; or the salvage database that we passed a couple of years ago here, which includes license plate information. And so, there are a lot of those areas that we'd have to work on, which is why we do so strongly advocate for the entity-level exemption. We wouldn't have to get into having to say, okay, we gotta insert this particular section and that particular section.
FARLEY-BOUVIER - Well, I appreciate that. Just letting you know that this committee is not averse to working hard to get it right in Massachusetts. Right, I know. And I appreciate your staff for that as well, Chair. We look forward to hearing from and working with you over the next 60 days. Okay. Sure. Any other questions? Representative Vitolo.
VITOLO - I very much appreciate the opportunity in insurance, particularly auto insurance, with locational data, to help drivers learn to go a little easier on the gas, a little easier on the brake, a little easier on the steering wheel, so we'll all be safer. My concern is data retention. That data is useful for a period of time, but are you gonna hold it forever? Even if you're not buying it or selling it, the longer you hold it, the more likely someone else gets it, even though you didn't want them to have it. And so my hope is, in working with the chairs about how to make sure we have sensible and appropriate protection for motorists, that you also think about data retention. You know, if 1 insurance company moves that customer onto an affiliate, well, then that first company doesn't need that data anymore. They should get rid of it. They shouldn't keep it. Right? If you've gotta move it around, don't also hold on to it. And so my hope is that as you think through this, you also think about how to not only collect just the data you need but also hold it only for the amount of time for which it's actually useful for the company and the customer to work out a better insurance rate. And after that, you gotta get rid of it.
STARK - Well, I'll say it's 2-fold on that, and that's a concern to some level. 1, the telematics, quote-unquote, black box data can be very helpful in a lot of claims disputes. And so I think we'd have to be careful about making sure that we preserve, at a minimum, this data for any claims that may be associated with it. And then there is a question, and I'll go back to my members on this, about how we are able to manage that UBI or telematics data in year-over-year comparisons. Now, there may be a way, and again, I'll go back and talk to them about this, to de-identify or average some of that data from, you know, the initial application to the end. We'll see. But those are 2 of my initial thoughts that come to mind: 1, you're going to wanna have something else to compare it to; 2, there is some importance to this data for claims investigations.
REP OUELLETTE - Thank you. A question on telematics. Why couldn't you just use a serial number or something and keep the personal data within your systems at the insurance companies? In other words, you're worried about it getting out. Keep it internal in your information systems, whatever goes over the airwaves or what have you. Why wouldn't you just use a serial number or something that wouldn't have personal data?
STARK - I think we'll have to look into that, because I do think that part of the complaint that some of the advocates for H 78, some of the more strenuous requirements, have is that even when we work to de-identify it, people may be able to put those pieces back together again. It doesn't mean we don't do anything, but I'm happy to take a look at that and talk to my folks about, 1, what that takes, and is it possible? But 2, if somebody got this data with nefarious purposes, would they be able to recreate it from the de-identified data? Yeah.
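A minimal sketch of the serial-number idea raised in this exchange, under assumed, hypothetical field names rather than any carrier's actual schema: direct identifiers are replaced with a keyed pseudonym before records leave the insurer, the lookup key stays internal, and precise GPS traces are dropped because they can re-identify a driver even without a name, which is the "put the pieces back together" concern.

import hmac
import hashlib

INTERNAL_SECRET = b"keep-this-key-inside-the-insurer"  # never shared with vendors

def pseudonym(policy_id: str) -> str:
    # Derive a stable, non-reversible "serial number" from the policy ID.
    return hmac.new(INTERNAL_SECRET, policy_id.encode(), hashlib.sha256).hexdigest()[:16]

def strip_pii(record: dict) -> dict:
    # Keep only the driving metrics, keyed by the pseudonymous serial.
    return {
        "serial": pseudonym(record["policy_id"]),
        "hard_brakes": record["hard_brakes"],
        "avg_speed_mph": record["avg_speed_mph"],
        # Precise GPS traces are deliberately omitted: even without a name,
        # home and work locations can re-identify a driver.
    }

raw = {"policy_id": "MA-123456", "hard_brakes": 2, "avg_speed_mph": 34.1,
       "gps_trace": [(42.36, -71.06), (42.37, -71.05)]}
print(strip_pii(raw))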
FARLEY-BOUVIER - Yeah. Go ahead.
MOORE - So, 1 quick question I have. You're talking about whether they shut it off at the state line. Well, we've seen some states now that are trying to criminalize certain aspects of access to health care for women. So now we have someone who travels from a state that criminalizes it, comes here, receives whatever health procedure she receives. Now she goes back to her state, and they look at that location data and say, okay, now we're gonna criminally prosecute her. We wouldn't want that. Well, exactly. No, I don't think anyone would. So there's no way with technology? I mean, technology is always being used to benefit businesses, profits, employers, or whatever the so-called benefit is. But when it comes to protecting someone's privacy or protecting someone's rights, we can't seem to utilize that technology in a way that's gonna protect those rights.
SPEAKER11 - Well, no.
STARK - I think that we do, even at a protection level. I can tell you that when it comes to the data protection laws in the state, especially as they govern insurance, the AG's office has more than enough resources to check in on us.
MOORE - But do you see what I'm getting at?
STARK - Yeah. No. I do think there's something there, and I'll go back to my members and see. What I do know is that 1 of the challenges we have with this, like the Shield Act, for instance, and just some of the general thoughts out there about it, is that we don't know where any reproductive clinic might be, or an LGBT bar, or a protest that is going on at any given time. And so, I think that's where it'd be difficult. I know some of the ideas have been about making sure that, in those spaces, location data isn't collected. But, again, that's difficult for us because we're not collecting it for those purposes. So how do you know where all of these sensitive locations may be?
FARLEY-BOUVIER - So just to reiterate, and then, you know, I need to move on. To collect it for your purposes is 1 thing. To sell it, when you don't know who's gonna use it for something else, you don't know their purposes, so don't sell it. That's what we're saying. We are banning the sale and purchase of data. That's what H 78 says. Okay. I really do appreciate it, and I hope you will stay in touch with us.
SPEAKER26 - Yeah, absolutely.
SPEAKER28 - Thank you. Thank you.
FARLEY-BOUVIER - We're going to move to Emily Anesta. And I don't know if Daniel Wolfe is still online. If not, then we would go to Robert Stroop. Hello. Welcome. Hi. Thank you for your patience today.
EMILY ANESTA - BAY STATE BIRTH COALITION - HB 78 - HB 104 - SB 45 - SB 197 - Thank you so much, chairs and members of the committee. As you know, my name is Emily Anesta. I lead an organization called the Bay State Birth Coalition. We're a consumer-led grassroots nonprofit seeking to improve maternal health care, especially access to community-based care, midwives, and birth centers in Massachusetts. Many of you know me as a maternal health advocate, but you may not know that I have a master's and a bachelor's in electrical and computer engineering from Northeastern and WPI, and I spent 20 years working on new technology research and development, both commercial and government-funded. I'm here today to share my support for, and my plea for urgency on, consumer privacy broadly and banning location data sales in particular: the Location Shield Act, H 86 and S 197, and the comprehensive data privacy bills H 78, H 104, S 29, and S 45. As a resident of Massachusetts, an engineer, a maternal health advocate, a woman, and a mother, I am terrified. Like everyone, I bring my phone everywhere. And as my children get older and have their own devices, I'm worried about their privacy as well. Even with my technical expertise and awareness, I find it impossible to protect my privacy while participating in modern society. Opting out of location tracking, as you know, is unrealistic. It's impossible.
We use our phones to stay connected, to pay for things, and to do everything, including keeping in touch with our children and vulnerable family members. And self-regulation by industry does not work. It's really incumbent upon governments, and I'm so grateful for the attention you all are paying to this, to put up the guardrails to prevent the exploitation of our most personal details and everything that can be inferred from that data. The technologies and the data exploitations are novel, and the solutions must be as well. In addition to the many important issues raised today, please also understand that this has important implications for maternal health and for our state's progress, and the wonderful work that's happening here, to improve health outcomes and end racial inequities for mothers and babies.
Privacy fears and threats can prevent our most vulnerable mothers from accessing necessary maternal health care, putting birthing people and their pregnancies at risk. People deserve and, in fact, require privacy when they're seeking prenatal care, giving birth, needing emergency care for an alarming pregnancy complication, and receiving essential postpartum care. And this includes immigrant mothers, undocumented mothers, LGBTQ birthing people, people who've been victims of domestic violence and stalking, and birthing people who may have experienced sexual assault. We also have to think about health care providers' own data, which is a concern for them and the families that they care for. This threat impacts our... oh, I'm sorry. Time's up. Anyway, freedom of movement also reveals a person's pregnancy status. These are all concerns. A novel approach is what's needed to meet the moment, not stale approaches that lag the technology. So thank you.
FARLEY-BOUVIER - Thank you. Thank you. Any comments or questions from the committee? Thank you. I don't know about you. Go ahead.
MESCHINO - You seemed to hesitate. What was the other sentence you wanted to share with us? I just wanna hear it. I think it's important that you just brought in a different perspective. And I just thought it was an important piece, and I just wanted to hear what it was you omitted.
ANESTA - Thanks. I was. The other couple of quick things I was gonna say were that data exploitation can reveal a person's pregnancy status, so that can potentially expose this private information to an employer, family member, or abuser. A novel approach is what's needed to meet this moment, because the privacy risks are cutting-edge and always evolving. And so we should not be looking to stale solutions. We should be looking to lead in Massachusetts on data privacy, and location data in particular, to protect maternal health, reproductive justice, and human rights.
FARLEY-BOUVIER - Thanks very much. Thank you. It's really helpful. Thank you. Appreciate it. So just checking, we don't see Daniel Wolf still with us online, so we're gonna move to Robert Stroop. And then, back online, Lee Hepner.
SPEAKER22 - Thank you.
FARLEY-BOUVIER - Hello, and welcome, and thank you for your patience. I feel like you've been here all afternoon. Yes?
SPEAKER29 - I have been. Yeah.
SPEAKER3 - Because it's been fascinating.
BOBBY STROOP - HARVARD LAW SCHOOL - HB 103 - It's been very interesting. I'm here to speak on H 103, an act establishing the Massachusetts Neural Data Privacy Protection Act. And thank you, Representative Owens, for telling me about this hearing when I was reaching out, actually, about a different tech policy issue. My name is Bobby Stroop. I'm a third-year law student at Harvard Law School, and I'm here in my individual capacity to share what I've learned from my conversations with innovative entrepreneurs and in my own bioethics research. I would consider myself a futurist. I'm constantly thinking about the world in which technologists are going to be developing products and the world in which my baby daughter will grow up. I'm thinking about all of us. And we stand on the shoulders of data privacy advocates going back to the Harvard Law Review article in 1890, The Right to Privacy.
I brought it up today. And, unsurprisingly, the right has evolved since then. My work specifically focuses on BCI, or brain-computer interfaces, and specifically, even within that, consumer neurotech. A brain-computer interface, or BCI, is a device that allows a user to control a computer using only their brain activity. A non-implantable BCI is a commercial off-the-shelf product. It doesn't require surgery. It looks like AirPods; it might be headphones, something of that nature. And there are incredible capabilities. Duke law professor Nita Farahany wrote in her 2023 book that an average tech-savvy person can now see their emotions, arousal, and alertness and track how effectively they are meditating. The Neurorights Foundation, on whose work I've actually built a lot of my own research, although I'm not affiliated, has mentioned thought-to-text, where researchers have shown you can think of specific letters or even whole words and translate that into text.
And generative AI is only going to further the capabilities of neurotech, and investments are in the billions of dollars. This is going to be huge. Representative Vitolo, I believe in you. I believe in all of you. I'm less confident in the federal government to address this issue. Chair Moore, you mentioned HIPAA earlier. HIPAA doesn't cover consumer neurotech, and that's specifically because it only applies to covered entities: health plans, health care clearinghouses, and health care providers. Additionally, the FDA doesn't cover consumer neurotech, because it's not a medical device, and FDA oversight is only in regard to physical medical safety. To summarize the relevant portion of that act, if it's not medical treatment or advice, it's not a medical device. The health and wellness exception is for low-risk products that promote a healthy lifestyle, which is where consumer neurotech sits. And essentially, even with discretionary enforcement, the FTC probably can't write the rules on data privacy specifically for neurotech and some of the rights that we need. I know that it gets cold in Massachusetts, and I'd frankly rather have a patchwork quilt than nothing at all.
SPEAKER11 - Wow. There you go.
SPEAKER5 - Thank you. Thanks.
FARLEY-BOUVIER - I really appreciate this perspective. Thank you. I really do. I really appreciate your patience today. At least I hope it was a learning experience for you. I just wanna check with the committee. Senator?
MOORE - If you have some spare time in your studies, if you could actually put something in writing to us. Absolutely. That would help us as we look at the legislation for any possible changes that may be necessary. But thank you.
SPEAKER22 - Thank you. Representative Owens?
OWENS - Yeah. Thanks so much for coming out today. I'm sorry that I clued you into this and you had to stay all day. So, my condolences and apologies as well, but I hope you learned something. And thank you for coming out. We'll be in touch. Absolutely. Thank you. Representative Meschino.
MESCHINO - Just because you're a law school student and it's all positive, the thing that keeps coming up is this idea of abuse of the courts. And I just wanted on the record that, you know, I'm licensed in 2 different jurisdictions, and you have to have a colorable claim. You have to have, you know, evidence to support it. You have to get past all your motions to dismiss. Then you have to be able to make your case. And so this idea that somehow consumer protection and class actions are an abuse of the court system, well, you just triggered the thought. I was trying not to add to the hearing because it was so late, so I kept that to myself, but you triggered the thought when you were referring to the law review articles and the right to privacy, and this is very much grounded in all of our 60 years of jurisdiction, and jurisprudence. So I just wanted to say that on the record, and thank you for reminding me to say it. Thank you.
FARLEY-BOUVIER - Thank you. Anybody else? Great. Thanks again. Appreciate your testimony today.
SPEAKER22 - Thank you for the time.
FARLEY-BOUVIER - We're now going back online. Is Lee Hepner with us online? And then we have Kelvin Green.
LEE HEPNER - AMERICAN ECONOMIC LIBERTIES PROJECT - HB 99 - SB 47 - (R) Good evening, Chairs Farley-Bouvier and Moore. Thank you for engaging in a remarkable hearing today. I've been tuning in for most of it, and it's a breadth of work; I've actually learned quite a bit. My name is Lee Hepner. I'm an antitrust attorney and senior legal counsel at the American Economic Liberties Project. We're a nonprofit, nonpartisan research and advocacy group that seeks to understand... oh, I'm sorry, I just heard an echo there. Can you hear me alright?
FARLEY-BOUVIER - Now it's better. Thank you. We're good. You're good.
HEPNER - So I work for the American Economic Liberties Project. We're a nonprofit and nonpartisan research and advocacy group that seeks to understand the ways in which corporate power is wielded to harm consumers, workers, and small businesses across the country. I'm here to address the committee on bills H 99 and S 47, which pertain to surveillance pricing in grocery stores. Earlier this year, we coauthored a new and fairly exhaustive report on the phenomenon of surveillance pricing. In brief, surveillance pricing refers to the ability of corporations to use vast quantities of personal data to set individualized prices and wages that are different for each person, exploiting consumers based on their unique vulnerabilities and behaviors. And now it is not news, particularly to you all who have been inundated with this information today, that data brokers and corporations now collect information about where we go, what we watch, what we like, what videos our cursors linger over, and what loans we take out.
What is new is this phenomenon of firms running those data points through algorithms to set individualized prices and wages, rigging the market to charge consumers as much as they are willing to pay and to pay workers, who are indeed also consumers, the minimum amount they are willing to work for. So it is a spiral to the top for prices and a spiral to the bottom for wages. Now, this issue recently entered the national political discourse when Kroger announced a plan to install digital price tags and face recognition surveillance in its stores, and you've heard a little bit about that today. The biometric component of this raised a lot of questions. Why would a grocery store need to engage in face surveillance and biometric identification of individual shoppers unless the goal was specifically to exploit that information for some other purpose? Now, this is far from just an in-store issue. In fact, there is substantial evidence that the online applications of surveillance pricing are already pervasive. There's a very large grocery delivery app that recently announced on an earnings call that they had rolled out technology to assess the price sensitivity of individual consumers using the app, and they then have the capability to set different prices for me buying a banana than for my neighbor buying a banana, based on my willingness to pay and my price sensitivity.
So this capability is here. The incentive is clearly here, and I think the legislation before you in these 2 bills is clearly aimed at grappling with it. And I would, you know, encourage you, and it's a little bit outside the context of these 2 bills, to engage on the wage side of this as well: the fact that, particularly in the gig economy, different workers are being paid different amounts for the same type of work, for the same task, based on how much money they have in their bank accounts, what their credit score is, and other financial vulnerabilities they may be, you know, subject to. So this is a really complex issue, but the 2 bills that I'm here to speak on today, and indeed a lot of the location-based issues that you all are grappling with today, really point toward some real progress. I'm genuinely looking forward to the progress you all are making. This is tremendous work, and it's been great to listen in.
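A purely hypothetical sketch of the mechanism this testimony describes, an individualized price derived from an inferred willingness-to-pay score; every input, weight, and number here is invented for illustration, and it depicts the practice the bills would restrict, not a recommendation.

def estimated_sensitivity(profile: dict) -> float:
    # Toy score in [0, 1]; higher means the shopper is judged less price-sensitive.
    score = 0.5
    if profile.get("premium_device"):
        score += 0.2
    if profile.get("urgent_need"):       # e.g. an inferred time-sensitive purchase
        score += 0.2
    if profile.get("compares_prices"):
        score -= 0.3
    return max(0.0, min(1.0, score))

def individualized_price(base_price: float, profile: dict) -> float:
    # Scale the shelf price by up to 25% based on the inferred score.
    return round(base_price * (1.0 + 0.25 * estimated_sensitivity(profile)), 2)

# Two shoppers, same bananas, different prices.
print(individualized_price(0.99, {"premium_device": True, "urgent_need": True}))
print(individualized_price(0.99, {"compares_prices": True}))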
FARLEY-BOUVIER - Thank you so much. I'm gonna first turn to the committee for any comments or questions. Senator, do you have anything? I do have a question for you, if you would indulge me, please, Mister Hepner. You came for a specific purpose, to talk about those bills, but you've also listened. So, with your multistate view, do you have a view on comprehensive data privacy?
HEPNER - Yeah. Absolutely. I mean, I think that states have a tremendous ability to lead here, particularly on ideas like data minimization, which I think has been a common theme of what you've heard today. The notion is that if data is being collected, and we know it's being collected, let's at least minimize the purposes for which it can be used, and minimize the retention of that data so that it's not lingering beyond the use it was collected for. So, I think those principles are really fundamental. And what's kind of scary, particularly in the era of AI and new models that are becoming kind of agentic in their own right, is the ability for that data to be exploited long term for purposes that are far different from what it was collected for.
Now, I think that certain uses should just be barred. I don't think we should be collecting personal information to charge people different prices or to pay people different wages. I think that is just an insidious exploitation of people's personal information, and the only upside is going to be for the large corporations seeking that capability on the backs of consumers and workers. But I think the principles that you have been grappling with all day today are super relevant: location data, data minimization, comprehensive privacy policy. I'm based in California, and many of my colleagues are in DC, but we have a very rigorous and really effective comprehensive privacy law here in California that the business community has adapted to. It's been a few years, but I think it's working.
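A minimal sketch of the data-minimization and retention-limit ideas described above, with made-up purposes and retention windows purely for illustration: data may only be stored against a declared purpose, and anything past its window is purged.

from datetime import datetime, timedelta, timezone

RETENTION = {                       # declared purpose -> how long the data may be kept
    "fulfill_order": timedelta(days=90),
    "fraud_check": timedelta(days=365),
}

store = []                          # each record: field, value, purpose, collected_at

def collect(field, value, purpose):
    # Refuse to collect anything that has no declared purpose.
    if purpose not in RETENTION:
        raise ValueError(f"no declared purpose '{purpose}'; collection refused")
    store.append({"field": field, "value": value, "purpose": purpose,
                  "collected_at": datetime.now(timezone.utc)})

def purge_expired(now=None):
    # Drop every record whose retention window has passed.
    now = now or datetime.now(timezone.utc)
    store[:] = [r for r in store
                if now - r["collected_at"] <= RETENTION[r["purpose"]]]

collect("shipping_address", "1 Main St, Pittsfield", "fulfill_order")
purge_expired()                     # the address disappears once its 90 days are up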
FARLEY-BOUVIER - Thank you. Appreciate that. Pleasure. We thank you for your testimony, and we'd be happy to continue to engage with you throughout this process.
HEPNER - Thank you so much.
FARLEY-BOUVIER - Sure. So we're next gonna go to Kelvin Green. Is Kelvin here? And after that is Tom Nyer, who is our last testifier, or anybody else who's here and signed up. Okay. Oh, good. I've been watching you, and I was wondering about your name, so that's okay. We're not gonna let anybody not testify. I promise. Kelvin, it was nice to see you today.
SPEAKER12 - Nice to see you. Yeah.
KELVIN GREEN II - KAIROS ACTION - HB 104 - SB 45 - SB 29 - Hello to the chairs, to the committee, to the staff, and to the people who've also been waiting. Hello. My name is Kelvin Green II. I am a Malden resident. I also sit here as a believer in Jesus Christ and his call to service. I'm an 8-year resident of Massachusetts. I'm a graduate of MIT. I'm a former community organizer in Somerville, Randolph, Everett, Dorchester, Mattapan, Roslindale, Jamaica Plain, and Woburn. And I'm here as a senior organizing specialist at Kairos Action. We are a national organization focused on nonpartisan education and advocacy at the intersection of technology, democracy, and other issues our communities face today.
I believe, and have always believed, that Massachusetts is well positioned to pioneer and lead the way, especially on 1 of the most pressing issues facing our communities, which is the severe invasion of privacy by corporations and the lack of legal guardrails preventing them from doing so. This is why I ask each of you to support, and favorably report, H 86 and S 197, the Location Shield Act. I also want to register my support for the following comprehensive privacy bills: H 104, S 45, S 29, and H 78. In the work that I do, I've been a witness to Massachusetts residents who've been doxed. That means their personal and private data has been published online for no other purpose but to bring harm. These community members rightfully feel unsafe in their own communities. They live under threat of online and, unfortunately, offline violence, with their data publicly floating in the ether, and many remain traumatized by the fear of such. At Kairos, we like to bring digital issues into in-person reality.
So I tell folks, you know, imagine your home without a front door. Imagine how many people could come in without you knowing. Imagine folks in your home observing you with cameras, reporting back to their collaborators, stealing, plotting, and even selling what they discovered to someone else. This is what our current digital landscape looks like. And so, strong consumer privacy legislation erects a stable door. We just want a front door, with a lock, on all of our digital lives to protect us, so We The People can safely live online and offline. This door provides protection and authority to We The People over what happens with our data, whether we want folks in or we're keeping perceived threats out. So, in order to protect Bay Staters from being the victims of massive breaches of privacy, Massachusetts needs to include these 3 components in its legislation. You've heard this already today, but I will repeat them. 1, data minimization. A calculator or flashlight app doesn't need to know where I am to work.
So strong privacy legislation needs to ensure companies are aligning their data collection practices with what the user is expecting. 2, a ban on the sale of sensitive data. Businesses should not be able to sell sensitive data about us without us, especially when there are cases of folks avoiding care or being denied care. And lastly, strong enforcement, including a private right of action. We know that policies are only as good as their ability to be enforced. There are real people depending on these policies turning into real-world action in order to keep us safe. At Kairos, we know people deserve a say in how our data is used and how the Internet is governed. We appreciate Massachusetts for pioneering legislation. I will say, Maryland has produced something, but they do not have the private right of action, so you would be pioneering legislation that will make tech work for all. And lastly, I'll just say, as someone born in '98, 2 years after the GLBA, I identify with 36% of Massachusetts residents, about 2,000,000 people, who are under 29, and we only know the world as deeply digital, online and offline. So we are excited for you to have this opportunity to consider us, in this world that we've inherited, and to provide protection. Thank you.
FARLEY-BOUVIER - Thank you. Thank you for your strong testimony. We appreciate it. Are there any comments or questions from the committee? Perfect. Okay. We're gonna go to Tom Nyer in the room and then Karen. Thank you. Great. And is there anybody else? Just let us know. Okay? Is Tom here? Is Tom online? Okay. So, Karen, you are up. Hi. And we are delighted. Oh, no. Of course. Everybody here has had a lot of patience today.
KAREN BAUERLE - TOWN OF BELMONT MASSACHUSETTS - This is super important. And I'm really tired. Yeah. I don't wanna just lose it right here for you. My name is Karen Bauerle. I'm a resident of Belmont, Massachusetts. I'm a member of town meeting, I represent the Disability Access Commission in our town, and I'm a member of the housing trust. I have also retired my law license, because it became useless to me, because I have multiple chronic illnesses. And I wanna say that it started when I got married: when I was 24, I lost my health insurance. I lived in Georgia, and I was put on a computer. I could no longer get health insurance as a result of that single denial. I have run from that my whole life. Wow.
And without a spouse, I would have no protection, except now that I live in Massachusetts. But, I mean, this has really been following me around, and it's actually made me sicker. My mental health has suffered. My children's mental health has suffered. My physical health has suffered. I drive my test results to my doctors so I don't have to put them online. That's really hard to do when you're chronically ill. Yeah. And I'm not employable. You know? I used to be a person. Now what am I? A number? No. Now I'm a person to corporations that wanna sell my information. And there's a great deal of precarity that's very practical and very real. Some of it's in our heads, but most of it's really in our lives. And this needs to be attended to. I would argue for the strongest possible comprehensive data protections available.
Because I deal with health insurance a lot, I am incredibly exposed. My information is spread out on different networks in this state. I have doctors who cannot communicate with each other because they're excluded by other programs. So pertinent information falls through the cracks for me. I need, like, a team leader. And I am a fortunate person; I'm capable, and I would like one. I'm just gonna go to my next point, which is that I really want my personal financial information made safe. I have no financial information online, to the best of my ability. What I have done is withdraw from life as a result of this. And I'm also obviously concerned about biometric data. I don't pay the same price for my groceries as anybody else. There are many other groups who are in a much more precarious situation than me. They deserve our deepest consideration. I also would argue for a personal right of action. I personally wouldn't mind being in the trial. Yeah. And I just wanna thank you. I'm gonna go ahead and stop talking.
SPEAKER14 - Okay.
FARLEY-BOUVIER - And, you know, your written testimony is very valuable to us, so please do submit that. And I think that your testimony really exemplifies the burden on the consumer and how we really have that backward. We do. So, we appreciate your testimony. I just wanna look around; I feel like we got to everybody.
MOORE - Well, thank you for waiting.
FARLEY-BOUVIER - Thank you. Really do appreciate that. And so 1 more plea: written testimony is encouraged, and we're asking for that to be submitted by Monday the fourteenth at 5 PM. Okay? Thank you, everybody. We're gonna call that a close.
© InstaTrac 2025