Something is either public record - in which case it should be on a government website for free, and the AI companies should be free to scrape to their hearts' content...
Or it should be sealed for X years and then public record. Where X might be 1 in cases where you don't want to hurt an ongoing investigation, or 100 if it's someone's private affairs.
Nothing that goes through the courts should be sealed forever.
We should give up on the idea of databases which are 'open' to the public, but you have to pay to access, reproduction isn't allowed, records cost pounds per page, and bulk scraping is denied. That isn't open.
Open to research yes.
Free to ingest and make someone's crimes a permanent part of AI datasets resulting in forever-convictions? No thanks.
AI firms have shown themselves to be playing fast and loose with copyrighted works; a teenager shouldn't have their permanent AI profile become "shoplifter" because they did a crime at 15 yo that would otherwise have been expunged after a few years.
>”Free to ingest and make someone's crimes a permanent part of AI datasets resulting in forever-convictions? No thanks.”
1000x this. It’s one thing to have a felony for manslaughter. It’s another to have a felony for drug possession. In either case, if enough time has passed, and they have shown that they are reformed (long employment, life events, etc) then I think it should be removed from consideration. Not expunged or removed from record, just removed from any decision making. The timeline for this can be based on severity with things like rape and murder never expiring from consideration.
There needs to be a statute of limitations just like there is for reporting the crimes.
What I’m saying is, if you were stupid after your 18th birthday and caught a charge peeing on a cop car while publicly intoxicated, I don’t think that should be a factor when you’re 45 applying for a job after going to college, having a family, having a 20-year career, etc.
Also, courts record charges which are dismissed due to having no evidential basis whatsoever and statements which are deemed to be unreliable or even withdrawn. AI systems, particularly language models aggregating vast corpuses of data, are not always good at making these distinctions.
That is a critical point that AI companies want to remove. _they_ want to be the system of record. Except they _can't_. Which makes me think of LLMs as just really bad cache layers on the world.
> I think it should be removed from consideration. Not expunged or removed from record, just removed from any decision making. The timeline for this can be based on severity with things like rape and murder never expiring from consideration.
That's up to the person for the particular role. Imagine hiring a nanny and some bureaucrat telling you what prior arrest is "relevant". No thanks. I'll make that call myself.
Many countries have solved this with a special background check. In Canada we call this a "vulnerable sector check," [1] and it's usually required for roles such as childcare, education, healthcare, etc. Unlike standard background checks, which do not turn up convictions which have received record suspensions (equivalent to a pardon), these ones do flag cases such as sex offenses, even if a record suspension was issued.
They are only available for vulnerable sectors; you can't ask for one as a convenience store owner vetting a cashier. But if you are employing child care workers in a daycare, you can get them.
This approach balances the need for public safety against the ex-con's need to integrate back into society.
[1] https://rcmp.ca/en/criminal-records/criminal-record-checks/v...
In the UK the equivalent is a DBS (Disclosure and Barring Service) check.
And indeed there are four different levels for that.
That's the reality in my country, and I think most European countries. And I'm very glad it is. The alternative is high recidivism rates because criminals who have served their time are unable to access the basic resources they need (jobs, housing) to live a normal life.
Then before I give you my business or hire you, I also want to know that you are the kind of person that thinks they have a right to any other person's entire life, so I can hold it against you and prevent you from benefitting from all your other possible virtues and efforts.
So I likewise, require to know everything about you, including things that are none of my business but I just think they are my business and that's what matters. I'll make that call myself.
> I'll make that call myself.
This is why this needs to be regulated.
No one is forcing you to hire formerly incarcerated nannies but you also aren’t entitled to everyone’s life story. I also don’t think this is the issue you’re making it out to be. Anyone who has “gotten in trouble” with kids is on a registry. Violent offenders don’t have their records so easily expunged. I’m curious what this group is (and how big they are) that you’re afraid of.
I also think someone who has suffered a false accusation of that magnitude and fought to be exonerated shouldn’t be forced to suffer further.
What criminal records do you have? Please provide a way to verify. Until then, you cannot be trusted in any capacity.
>That's up to the person for the particular role. Imagine hiring a nanny and some bureaucrat telling you what prior arrest is "relevant". No thanks. I'll make that call myself.
Thanks, but I don't want to have violent people working as taxi drivers, pdf files in childcare and fraudsters in the banking system. Especially if somebody decided to not take this information into account.
Good conduct certificates are there for a reason -- you ask the faceless bureaucrat to give you one for the narrow purpose and it's a binary result that you bring back to the employer.
> pdf files
Please don't unnecessarily censor yourself for the benefit of large social media companies.
We can say pedophile here. We should be able to say pedophile anywhere. Pre-compliance to censorship is far worse than speaking plainly about these things, especially if you are using a homophone to mean the same thing.
I actually find this amusing and do it because I like to. We are witnessing the new tabooed word, where the usual sacrilege doesn't hit the nerve anymore.
> Not expunged or removed from record, just removed from any decision making.
This made me pause. It seems to me that if something is not meant to inform decision making, then why does a record of it need to persist?
If someone is charged with and found innocent of a crime, you can't just remove that record. If someone else later finds an account of them being accused, they need a way to credibly assert that they were found innocent. Alternately if they are convicted and served their sentence, they might need to prove that in the future.
Sometimes people are unfairly ostracized for their past, but I think a policy of deleting records will do more harm than good.
Or in the case of, down the road, repeating an offense. The judge sees you had an issue in the past, were good for a while, then reoffended, suggesting something has happened or that the individual has lost their motivation to stay reformed. They can sentence time for the crime but then also help the individual find support to get back on track. We have the systems in place to do this, we just don’t.
Also, when applying for a loan, being a sex offender shouldn’t matter. When applying for a mortgage across the street from an elementary school, it should.
The only way to have a system like that is to keep records, permanently, but decision making is limited.
> Also, when applying for a loan, being a sex offender shouldn’t matter. When applying for a mortgage across the street from an elementary school, it should.
Should it though? You can buy a piece of real estate without living there, e.g. because it's a rental property, or maybe the school is announced to be shutting down even though it hasn't yet. And in general this should have nothing to do with the bank; why should they care that somebody wants to buy a house they're not allowed to be in?
Stop trying to get corporations to be the police. They're stupendously bad at it and it deprives people of the recourse they would have if the government was making the same mistake directly.
Yeah I agree, a corporation should not only not care, they should be actively prevented from being allowed to discriminate based on anything outside of whether they can pay or not. If they sense a potential other problem, at worst it should be reported to police or some other governmental authority; it simply isn't their business otherwise.
To me any other viewpoint inevitably leads to abuse of one group or class or subset of society or another. If they are legally allowed to discriminate in some ways, they will seek to discriminate in others, both in trying to influence law changes to their benefit and in skirting the law when it is convenient and profitable.
>Also, when applying for a loan, being a sex offender shouldn’t matter. When applying for a mortgage across the street from an elementary school, it should.
I'm not sure we can write that much more COBOL.
> If someone else later finds an account of them being accused, they need a way to credibly assert that they were found innocent.
At the heart of Western criminal law is the principle: You are presumed innocent unless proven guilty.
Western systems do not formally declare someone "innocent".
A trial can result in two outcomes: Guilty or Not Guilty (acquittal). Note that the latter does not mean the person was proven innocent.
> If someone is charged with and found innocent of a crime, you can't just remove that record. If someone else later finds an account of them being accused, they need a way to credibly assert that they were found innocent.
Couldn't they just point to the court system's computer showing zero convictions? If guilty verdicts show up there, then the absence of any is already proof there are none.
Nobody is found innocent in UK courts.
You are found Guilty or confirmed you continue to be Not Guilty.
In Scotland there was also the verdict "not proven" but that's no longer the case for new trials
That seems compatible with OP's suggestion, just with X being a large value like 100 years, so sensitive information is only published about dead people.
At some point, personal information becomes history, and we stop caring about protecting the owner's privacy. The only thing we can disagree on is how long that takes.
Right, except there are some cases where that information should be disclosed prior to their death. Sensitive positions, dealing with child care, etc. but those are specific circumstances that can go through a specific channel. Like we did with background checks. Now, AI is in charge and ANY record in ANY system is flagged. Whether it’s for a rental application, or a job, or a credit card.
> There needs to be a statute of limitations just like there is for reporting the crimes.
The UK does not have a statute of limitations
No? So I can report a petty theft from 35 years ago?
The UK has multiple legal systems.
https://en.wikipedia.org/wiki/Limitation_Act_1980
Applies to England and Wales, I believe there are similar ones for Scotland and NI
The AI should decide if it's still relevant or not. People should fully understand that their actions reflect their character and this should influence them to always do the right thing.
> People should fully understand that their actions reflect their character
As if "character" was some kind of immutable attribute you are born with.
The way you are raised has a big impact on how you act the rest of your life. There is signal in this information, not just noise.
It mostly is. People really do have personalities.
Except it will inevitably lead to discrimination and abuse as it always has in the past. How much of the US justice system is based on harassing poor communities using that kind of excuse? Even if one community is actually less likely to commit crimes than another, if you send 90% of your policing forces there, using the excuse that it is just the way they are and things don't change much, you will find 90% of your crimes there. Even if twice as many unresolved crimes are happening in the other area.
I actually think it's good to be able to discriminate against people with bad character.
If you can know the character of individual people, you have less reason to discriminate against those from statistically higher criminal communities.
> The AI should decide
That is a great recipe for systematic discrimination.
The whole goal of a hiring pipeline is to create a system to discriminate from a ton of candidates to good people to hire.
I find this a weird take. Are you saying you _want_ unaccountable and profit driven third party companies to become quasi-judicial arbiters of justice?
I am saying that it is not good if you had to hide information about yourself in order to get hired. Justice is provided by the courts.
> Justice is provided by the courts.
Indeed. And as far as I know, "courts" is not an alternative spelling of "AI".
The scenario we are talking about is AI having access to court records. The courts make rulings and then the AI works off of those.
So you’re all for photos on job applications then? That’s information about you.
Yes, I think that could be effective. The way people dress is correlated to the group of people they associate with which influences how they think and behave. If someone properly grooms themselves or not for such a picture provides more signal for the AI to pick up on.
Right, and the court decides whether that information is relevant anymore. You're suggesting we take that ability and give it to random third parties.
> What I’m saying is, if you were stupid after your 18th birthday and caught a charge peeing on a cop car while publicly intoxicated, I don’t think that should be a factor when you’re 45 applying for a job after going to college, having a family, having a 20-year career, etc.
I'd go further and say a lot of charges and convictions shouldn't be a matter of public record that everyone can look up in the first place, at least not with a trivial index. File the court judgement and other documentation under a case number, ban reindexing by third parties (AI scrapers, "background check" services) entirely. That way, anyone interested can still go and review court judgements for glaring issues, but a "pissed on a patrol car" conviction won't hinder that person's employment prospects forever.
In Germany for example, we have something called the Führungszeugnis - a certificate by the government showing that you haven't been convicted of a crime that warranted more than three months of imprisonment or the equivalent financial fine (measured in monthly earnings). Most employers don't even request that, only employers in security-sensitive environments, public service or anything to do with children (the latter get a certificate also including a bunch of sex pest crimes in the query).
France has a similar system to the German Führungszeugnis. Our criminal record (casier judiciaire) has 3 tiers: B1 (full record, only accessible by judges), B2 (accessible by some employers like government or childcare), and B3 (only serious convictions, the only one you can request yourself). Most employers never see anything. It works fine, recidivism stays manageable, and people actually get second chances. The US system of making everything googleable forever is just setting people up to fail.
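A rough way to picture the tiering (a toy sketch; the tier names match the French bulletins, but the entries and the filtering rule are illustrative assumptions, not the actual legal criteria):

    from enum import IntEnum

    class Bulletin(IntEnum):
        B1 = 1  # full record, judges only
        B2 = 2  # subset, e.g. government or childcare employers
        B3 = 3  # serious convictions only, requestable by the person themselves

    # Each entry is tagged with the *widest* bulletin it still appears in.
    # A petty offence shows up only on B1; a serious one propagates to B3.
    record = [
        {"offence": "petty theft (suspended fine)", "appears_up_to": Bulletin.B1},
        {"offence": "fraud",                        "appears_up_to": Bulletin.B2},
        {"offence": "armed robbery",                "appears_up_to": Bulletin.B3},
    ]

    def extract(bulletin: Bulletin) -> list[dict]:
        """Return what a requester entitled to this bulletin gets to see."""
        return [e for e in record if e["appears_up_to"] >= bulletin]

    print([e["offence"] for e in extract(Bulletin.B3)])  # ['armed robbery']
    print([e["offence"] for e in extract(Bulletin.B1)])  # all three entries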
The UK has common law: the outcomes of previous court cases and the arguments therein determine what the law is. It’s important that court records be public then, because otherwise there’s no way to tell what the law is.
It is the outcomes of appellate court cases and arguments that determine law in common law jurisdictions, not the output of trial courts. Telling what the law is in a common law system would not be affected if trial court records were unavailable to the public. You only actually need appellate court records publicly available for determining the law.
The appellate court records would contain information from the trial court records, but most of the identifying information of the parties could be redacted.
There are middle grounds - for example you could redact any PII before publishing it.
That's what Ukraine does, but I guess we have more resources to keep the digital stuff running properly and not outsource it to shadycorp.
It should be possible to redact names from cases for that purpose.
It should be possible to leverage previous case law without PII.
> It’s important that court records be public then, because otherwise there’s no way to tell what the law is.
So anyone who is interested in determining if a specific behavior runs afoul of the law has to not just read through the law itself (which is, "thanks" to being a centuries-old tradition, very hard to read) but also wade through court cases from, in the worst case (very old laws dating to before the founding of the US), two countries.
Frankly, that system is braindead. It worked back when it was designed as the body of law was very small - but today it's infeasible for any single human without the aid of sophisticated research tools.
You are correct which is why I recently built such a tool. Well, an evidence management tool.
The premise here is, during an investigation, a suspect might have priors, might have digital evidence, might have edge connections to the case. Use the platform and AI to find them, if they exist.
What it doesn’t do: “Check this video and see if this person is breaking the law”.
What it does do: “Analyze this person's photos and track their movements, see if they intersect with Suspect B, or if Suspect B shows up in any photos or video.”
It does a lot more than that but you get the idea…
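For what it's worth, the "do these two people's movements intersect" part boils down to a fairly simple query over timestamped locations (e.g. pulled from photo EXIF data). A toy sketch, with made-up thresholds and coordinates:

    from math import radians, sin, cos, asin, sqrt

    def haversine_km(lat1, lon1, lat2, lon2):
        """Great-circle distance between two lat/lon points in kilometres."""
        lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
        h = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
        return 2 * 6371 * asin(sqrt(h))

    def tracks_intersect(track_a, track_b, max_km=0.2, max_seconds=3600):
        """Each track is a list of (unix_time, lat, lon). True if the two people
        were ever within max_km of each other within max_seconds."""
        return any(
            abs(ta - tb) <= max_seconds and haversine_km(la, loa, lb, lob) <= max_km
            for (ta, la, loa) in track_a
            for (tb, lb, lob) in track_b
        )

    suspect_a = [(1700000000, 51.5074, -0.1278)]   # hypothetical photo metadata
    suspect_b = [(1700001800, 51.5080, -0.1270)]
    print(tracks_intersect(suspect_a, suspect_b))  # True: same area, 30 min apart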
The interpretation of the law is up to the courts. The enforcement of it is up to the executive. The concept of the law is up to Congress. That’s how this is supposed to work.
That can be solved by migrating to a sensible legal system instead.
[flagged]
[flagged]
We didn't? It must be a small minority of countries that dole out the same punishment for both.
- [deleted]
This is probably not the place for this discussion, good luck
[flagged]
> Free to ingest and make someone's crimes a permanent part of AI datasets resulting in forever-convictions?
You're conflating two distinct issues - access to information, and making bad decisions based on that information. Blocking access to the information is the wrong way to deal with this problem.
> a teenager shouldn't have their permanent AI profile become "shoplifter" because they did a crime at 15 yo that would otherwise have been expunged after a few years.
This would be a perfect example of something which should be made open after a delay. If the information is expunged before the delay, there's nothing to make open.
> Blocking access to the information is the wrong way to deal with this problem.
Blocking (or more accurately: restricting) access works pretty well for many other things that we know will be used in ways that are harmful. Historically, just having to go in person to a court house and request to view records was enough to keep most people from abusing the public information they had. It's perfectly valid to say that we want information accessible, but not accessible over the internet or in AI datasets. What do you think the "right way" to deal with the problem is, because we already know that "hope that people choose to be better/smarter/more respectful" isn't going to work.
> Blocking (or more accurately: restricting) access works pretty well for many other things that we know will be used in ways that are harmful. Historically, just having to go in person to a court house and request to view records was enough to keep most people from abusing the public information they had.
If all you care about is preventing the information from being abused, preventing it from being used is a great option. This has significant negative side effects though. For court cases it means a lack of accountability for the justice system, excessive speculation in the court of public opinion, social stigma and innuendo, and the use of inappropriate proxies in lieu of good data.
The fact that the access speedbump which supposedly worked in the past is no longer good enough is proof that an access speedbump is not a good way to do it. Let's say we block internet access but keep in person records access in place. What's to stop Google or anyone else from hiring a person to go visit the brick and mortar repositories to get the data exactly the same way they sent cars to map all the streets? Anything that makes it hard for giant companies is going to make it hard for the common person. And why are we making the assumption that AI training on this data is a net social ill? While we can certainly imagine abuses, it's not hard to imagine real benefits today, let alone unforeseen benefits someone more clever than us will come up with in the future.
> What do you think the "right way" to deal with the problem is, because we already know that "hope that people choose to be better/smarter/more respectful" isn't going to work.
We've been dealing with people making bad decisions from data forever. As an example, there was redlining, where institutions would refuse to sell homes or guarantee loans for minorities. Sometimes they would use computer models which didn't track skin color but had some proxy for it. At the end of the day you can't stop this problem by trying to hide what race people are. You need to explicitly ban that behavior. And we did. Institutions that attempt it are vulnerable to both investigation by government agencies and liability to civil suit from their victims. It's not perfect, there are still abuses, but it's so much better than if we all just closed our eyes and pretended that if the data were harder to get the discrimination wouldn't happen.
If you don't want algorithms to come to spurious and discriminatory conclusions, you must make algorithms auditable, and give the public reasonable access to interrogate these algorithms that impact them. If an AI rejects my loan application, you better be able to prove that the AI isn't doing so based on my skin color. If you can do that, you should also be able to prove it's not doing so based off an expunged record. If evidence comes out that the AI has been using such data to come to such decisions, those who made it and those who employ it should be liable for damages, and depending on factors like intent, adherence to best practices, and severity potentially face criminal prosecution. Basically AI should be treated exactly the same as a human using the same data to come to the same conclusion.
> The fact that the access speedbump which supposedly worked in the past is no longer good enough is proof that an access speedbump is not a good way to do it.
It worked well enough for a pretty long time. No solution can be expected to work forever, we just need to modify the restrictions on criminal histories to keep up with the times. It's perfectly normal to have to reassess and make adjustments to access controls over time, not only because of technology changes, but also to take into account new problems with the use/misuse of the data being restricted and our changing values and expectations for how that data should be used and accessed.
> If you don't want algorithms to come to spurious and discriminatory conclusions, you must make algorithms auditable, and give the public reasonable access to interrogate these algorithms that impact them.
I think we'd have much better success restricting access to the data than handing it out freely and trying to regulate what everyone everywhere does with that data after they already have it. AI in particular will be very hard to regulate (as much as I agree that transparent/auditable systems are what we want), and I don't expect we'd have much success regulating what companies do behind closed doors or forcing them to be transparent about their use of AI.
We both agree that companies should be held liable for the discriminatory outcomes of their hiring practices no matter if they use AI or not. The responsibility should always fall on the company and humans running the show no matter what their tools/processes are since they decide which to use and how to use them.
We also agree that discrimination itself should be outlawed, but that remains an unsolved problem since detection and enforcement are extremely difficult. It's easier to limit the opportunity to discriminate than try to catch companies in the act. You mention that hiding people's race doesn't work, but that's actually being explored as a means to avoid bias in hiring. For example, stripping names and addresses (which can hint at race) before passing resumes to algorithms seems like it could help reduce unintentional discrimination.
Ultimately, there'll always be opportunities for a bigot to discriminate in the hiring process, but I think we can use a multifaceted approach to limit those opportunities and hopefully force them to act more explicitly, making deliberate discrimination a little easier to catch.
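On the name/address-stripping point above: the mechanical part is simple enough to sketch. A toy illustration only (the field patterns and placeholder are assumptions; a real pipeline would work from structured applicant-tracking data rather than regexes over free text):

    import re

    # Lines and tokens most likely to proxy for protected attributes.
    PATTERNS = [
        re.compile(r"^(Name|Applicant)\s*:.*$", re.IGNORECASE | re.MULTILINE),
        re.compile(r"^(Address|Location)\s*:.*$", re.IGNORECASE | re.MULTILINE),
        re.compile(r"\b[\w.+-]+@[\w.-]+\.\w+\b"),   # email addresses
    ]

    def blind(resume_text: str) -> str:
        """Strip identifying fields before any scoring model sees the resume."""
        for pattern in PATTERNS:
            resume_text = pattern.sub("[REDACTED]", resume_text)
        return resume_text

    raw = "Name: Jane Doe\nAddress: 12 Example Street\nExperience: 8 years bookkeeping, jane.doe@example.com"
    print(blind(raw))   # the scorer only ever receives this redacted text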
> Blocking access to the information is the wrong way to deal with this problem.
That's an assertion, but what's your reasoning?
> This would be a perfect example of something which should be made open after a delay. If the information is expunged before the delay, there's nothing to make open.
All across the EU, that information would be available immediately to journalists under exemptions for the Processing of Personal Data Solely for Journalistic Purposes, but would be simultaneously unlawful for any AI company to process for any other purposes (unless they had another legal basis like a Government contract).
"court records are public forever" and "records of crimes expunged after X years" are incompatible.
Instead, we should make it illegal to discriminate based on criminal conviction history. Just like it is currently illegal to discriminate based on race or religion. That data should not be illegal to know, but illegal to use to make most decisions relating to that person.
Even if made illegal, how does enforcement occur? The United States, at least, is notorious for HR being extremely opaque regarding hiring decisions.
Then there's cases like Japan, where not only companies, but also landlords, will make people answer a question like: "have you ever been part of an anti-social organization or committed a crime?" If you don't answer truthfully, that is a legal reason to reject you. If you answer truthfully, then you will never get a job (or housing) again.
Of course, there is a whole world outside of the United States and Japan. But these are the two countries I have experience dealing with.
The founders of modern nation-states made huge advancements with written constitutions and uniformity of laws, but in the convenience of the rule of law it is often missed that the rule of law is not necessarily the prevalence of justice.
The question a people must ask themselves: we are a nation of laws, but are we a nation of justice?
Seems like a false dichotomy. You can be both, based on how you apply the laws.
The parent comment is not presenting a false dichotomy but is making precisely the point that it is how you apply the laws that matter; that just having laws is not enough.
Jesus ... that gives me a new perspective on Japan ...
The situation in the US is significantly worse, and probably in numerous other countries I have no experience with. Rather than asking if you've committed a crime, American employers/landlords will do a background check and are liable to turn you down if you've ever been arrested, even if the charges were dropped or you were found not guilty. Comparatively, the reason Japanese employers/landlords may even ask about having committed a crime is because they can't find that information on their own freely. This is a fairly ridiculous criticism, if you ask me. Nobody in any country wants to associate with criminals, often to an unfairly punitive degree, but at least in Japan you are not punished merely for being arrested. And while I don't doubt it happens, it's also far from a universal experience, despite Westerners loving to talk about Japan in broad sweeping generalizations. I have personally never been asked whether I've committed a crime.
One of the ways they keep crime so low. Being convicted destroys your reputation in a country where reputation is extremely important. Everyone loves saying it would be great to have lower crimes like Japan, but very few would really want the system that achieving that requires.
Their system seems to work better for them than our system does for us, so...
>Instead, we should make it illegal to discriminate based on criminal conviction history
Absolutely not. I'm not saying every crime should disqualify you from every job, but convictions are really a government-officialized account of your behavior. Knowing a person has trouble controlling their impulses, leading to aggravated assault or something, very much tells you they won't be good for certain roles. As a business you are liable for what your employees do; it's in both your interests and your customers' interests not to create dangerous situations.
This is an extremely thorny question. Not allowing some kind of blank slate makes rehabilitation extremely difficult, and it is almost certainly a very expensive net social negative to exclude someone from society permanently, all the way up to their death at (say) 70, for something they did at 18. There is already a legal requirement to ignore "spent" convictions in some circumstances.
However, there's also jobs which legally require enhanced vetting checks.
> However, there's also jobs which legally require enhanced vetting checks.
I think the solution there is to restrict access and limit application to only what's relevant to the job. If someone wants to be a daycare worker, the employer should be able to submit a background check to the justice system who could decide that the drug possession arrest 20 years ago shouldn't reasonably have an impact on the candidate's ability to perform the job, while a history of child sex offenses would. Employers would only get a pass/fail back.
>If someone wants to be a daycare worker, the employer should be able to submit a background check to the justice system who could decide that the drug possession arrest 20 years ago shouldn't reasonably have an impact on the candidate's ability to perform the job, while a history of child sex offenses would. Employers would only get a pass/fail back.
Welcome to the world of certificates of good conduct and criminal record extracts:
Other people have rights like freedom of association. If you’re hell-bent on violating that, consider the second-order effects. What is the net social negative when non-criminals freely avoid working in industries in which criminals tend to be qualified to work?
What do you mean by that?
Assuming you're asking in good faith, the parent could be referring to the 'market for lemons' in employment, where in lieu of being able to easily determine worker quality, employers start using second- or third-order- proxies for questions about, say, a candidate's likelihood of having a criminal record.
Or, you might just be doing the meme: https://x.com/MillennialWoes/status/1893134391322308918?s=20
> "court records are public forever" and "records of crimes expunged after X years" are incompatible.
Exactly. One option is for the person themselves to be able to ask for a LIMITED copy of their criminal history, which is otherwise kept private, but no one else.
This way it remains private; HR cannot force the applicant to provide a detailed copy of their criminal history and discriminate based on it. They can only get a generic document from the court via Mr Doe that says, "Mr Doe is currently eligible to be employed as a financial advisor" or "Mr Doe is currently ineligible to be employed as a school teacher".
Ideally it should also be encrypted by the company's public key and then digitally signed by the court. This way, if it gets leaked, there's no way to prove its authenticity to a third party without at least outing the company as the source.
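A minimal sketch of that flow, assuming Python's `cryptography` package; key management, identity verification and the certificate fields are all hand-waved assumptions:

    import json
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import rsa, padding
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    # Hypothetical keys: the court's long-term signing key, and the employer's
    # encryption key (whose public half is sent along with the request).
    court_key = Ed25519PrivateKey.generate()
    employer_key = rsa.generate_private_key(public_exponent=65537, key_size=3072)
    OAEP = padding.OAEP(mgf=padding.MGF1(hashes.SHA256()), algorithm=hashes.SHA256(), label=None)

    # The certificate says only "eligible or not" for one named role.
    cert = json.dumps({"subject": "J. Doe", "role": "school teacher",
                       "eligible": False, "issued": "2031-01-15"}).encode()
    signature = court_key.sign(cert)                  # 64-byte Ed25519 signature

    # Encrypt cert + signature to the employer's public key, so a leaked copy
    # can't be verified by anyone who doesn't hold the employer's private key.
    envelope = employer_key.public_key().encrypt(cert + signature, OAEP)

    # Employer side: decrypt, split off the fixed-size signature, verify.
    plaintext = employer_key.decrypt(envelope, OAEP)
    body, sig = plaintext[:-64], plaintext[-64:]
    court_key.public_key().verify(sig, body)          # raises if tampered with
    print(json.loads(body))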
Are you suggesting that I cannot refuse to hire a bookkeeper that has multiple convictions for embezzlement?
If you embezzled money at your last company, I shouldn't be able to decline to hire you on my finance team on that basis?
In many sane countries, companies can ask you to provide a legal certificate that you did not commit X category of crime. This certificate will then either say that you did not do any crimes in that category, or it will say that you did commit one or more of them. The exact crimes aren't mentioned.
Coincidentally these same countries tend to have a much much lower recidivism rate than other countries.
Everything should remain absolutely private until after conviction.
And only released if it's in the public interest. I'd be very very strict here.
I'm a bit weird here though. I basically think the criminal justice system is very harsh.
Except when it comes to driving. With driving, at least in America, our laws are a joke. You can have multiple at fault accidents and keep your license.
DUI, keep your license.
Run into someone because watching football is more important than operating a giant vehicle? Whatever, you might get a ticket.
I'd be quick to strip licenses over accidents and if you drive without a license and hit someone it's mandatory jail time. No exceptions.
By far the most dangerous thing in most American cities is driving. One clown on FanDuel while he should be focusing on driving can instantly ruin dozens of lives.
But we treat driving as this sacred right. Why are car immobilizers even a thing?
No, you can not safely operate a vehicle. Go buy a bike.
Arrests being a matter of public record are a check on the government's ability to make people just disappear.
But the Internet's memory means that something being public at time t1 means it will also be public at all times after t1.
You can have custody information be open for query without exposing all of the circumstances, and without releasing mugshots to private sites that will extort people to have them taken down.
You can do something very simple like having a system that just lists if a person is - at that moment - in government custody. After release, there need not be an open record since the need to show if that person is currently in custody is over.
As an aside, the past few months have proven that the US government very much does not respect that reasoning. There are countless stories of people being taken and driven around for hours and questioned with no public paper trail at all.
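Back to the custody-status idea: a toy sketch of what the "currently in custody, nothing more" query could look like (the storage and field names are assumptions; the point is that release deletes the row, so there is nothing left to scrape later):

    from datetime import datetime, timezone
    from typing import Optional

    # person_id -> time taken into custody; only current detentions are kept.
    _in_custody: dict[str, datetime] = {}

    def book(person_id: str) -> None:
        _in_custody[person_id] = datetime.now(timezone.utc)

    def release(person_id: str) -> None:
        # The record disappears on release; the public query has nothing to return.
        _in_custody.pop(person_id, None)

    def custody_status(person_id: str) -> Optional[datetime]:
        """Answers only: is this person in custody right now, and since when."""
        return _in_custody.get(person_id)

    book("case-12345")
    print(custody_status("case-12345"))   # a timestamp while detained
    release("case-12345")
    print(custody_status("case-12345"))   # None: no lingering public record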
They can disappear you indefinitely regardless.
Democrats love it too.
They call them Jump Outs. Historically the so-called constitution has been worth less than craft paper. From FDR's executive order 9066 to today, you have no rights.
There is an entire world where arrests are not a matter of the public record and where people don't get disappeared by the government. And then there is US where it is a matter of public record and (waves hand at the things happening).
So here in the U.S., the Karen Read trial recently occupied two years of news cycles— convicted of a lesser crime on retrial.
Is the position that everyone who experienced that coverage, wrote about it in any forum, or attended, must wipe all trace of it clean, for “reasons”? The defendant has sole ownership of public facts? Really!? Would the ends of justice have been better served by sealed records and a closed courtroom? Would have been a very different event.
Courts are accustomed to balancing interests, but since the public usually is not a direct participant they get short shrift. Judges may find it inconvenient to be scrutinized, but that’s the ultimate and only true source of their legitimacy in a democratic system.
Let's say a cop kills somebody in your neighborhood. Some witnesses say it looked like murder to them, but per your wishes the government doesn't say who the cop was and publishes no details about the crime... for two years, when they then say the cop was found not guilty. And as per your wishes again, even then they won't say anything about the alleged crime, and never will. Is this a recipe for public trust in their government?
Making the laws apply to the police the same as other citizens is, at least in the US, unlikely.
To me this brings in another question, when the discussion should be focused on to what extent general records should be open.
It is also possible to apply a higher standard to the government employees and force greater transparency on them, up to treating them as de-facto slaves of the society.
Yeah okay, different standard just for government employees... So consider the same scenario above except instead of a cop its the son of a politician or the nephew of a billionaire. Not government employees. Are you comfortable with the government running secret trials for them too? Are you confident that the system can provide fair and impartial judgments for such people when nobody is allowed to check their work?
Do you see a lot of billionaires and their nephews in the public trials right now? The one which definitely didn't kill the insurance CEO is going pretty well, judging from all the paid shilling on *grams and such.
Now for a serious answer: what happens in practice in Europe is not secret trials, because trials are very much public. Since there are only so many billionaires, their nephews, actual mafiosi and politically exposed people under prosecution, the journalists would monitor them closely, but will not be there at a hearing about your co-worker's (alleged) wife-beating activities.
It's all reported, surname redacted (or not, it depends), but we all know who this is about anyways. "Court records says that a head of department at a government institution REDACTED1 was detained Monday, according to the public information, the arrests happened at the Fiscal service and the position of the department head is occupied by Evhen Sraka".
What matters when this happens is not the exact PII of the person anyways. I don't care which exact nephew of which billionaire managed to bribe the cops in the end, but the fact that it happened or not.
Rank and file cops aren't that interesting by the way, unless it's a systemic issue, because the violence threshold is tuned down anyway -- nobody does a routine traffic stop geared up like an occupying army.
Like everything, privacy is not an absolute right and is balanced against all other rights and what you describe fits the definition of a legitimate public interest, which reduces the privacy of certain people (due to their position) by default and can be applied ad-hoc as well.
You'd need so many exceptions to such a law it would be leakier than a sieve. It sounds like a fine idea at ten thousand feet but it immediately breaks down when you get into the nitty gritty of what crimes and what jobs we're talking about.
Problem is it's very hard to prove what factors were used in a decision. Person A has a minor criminal record, person B does not? You can just say "B was more qualified" and as long as there's some halfway credible basis for that nothing can really be done. Only if one can demonstrate a clear pattern of behavior might a claim of discrimination go anywhere.
If a conviction is something minor enough that might be expungable, it should be private until that time comes. If the convicted person hasn't met the conditions for expungement, make it part of the public record, otherwise delete all history of it.
> You can just say "B was more qualified"
Sometimes you can't prove B was more qualified, but you can always claim some BS like "B was a better fit for our company culture".
Curious, why should conviction history not be a factor? I could see the argument that previous convictions could indicate a lack of commitment to no longer committing crimes.
I couldn't parse the intended meaning from "lack of commitment to no longer committing crimes", so here's a response that just answers the question raised.
Do you regard the justice system as a method of rehabilitating offenders and returning them to try to be productive members of society, or do you consider it to be a system for punishment? If the latter, is it Just for society to punish somebody for the rest of their life for a crime, even if the criminal justice system considers them safe to release into society?
Is there anything but a negative consequence for allowing a spent conviction to limit people's ability to work, or to own/rent a home? We have carve-outs for sensitive positions (e.g. working with children/vulnerable adults).
Consider what you would do in that position if you had genuinely turned a corner but were denied access to jobs you're qualified for?
The short answer is that it's up to a judge to decide that, up to the law what it's based on and up to the people what the law is.
Sure there is still some leeway between only letting a judge decide the punishment and full on mob rule, but it's not a slippery slope fallacy when the slope is actually slippery.
It's fairly easy to abuse the leeway to discriminate to exclude political dissidents for instance.
Because we as a society decided it creates externalities we don't want to deal with. With a list of exceptions where it actually is important because risk-reward balance is too much.
We as a society have decided no such thing, it is in fact legal to refuse somebody a job for having a criminal history, and will remain so.
that depends on a society, right?
Discrimination could be very hard to prove in practice.
> Instead, we should make it illegal to discriminate based on criminal conviction history.
Good luck proving it when it happens. We haven't even managed to stop discrimination based on race and religion, and that problem has only gotten worse as HR departments started using AI which conveniently acts as a shield to protect them.
Which is why in any country where criminal history is considered discrimination, this information is simply not provided. Because these companies have learned over the years that "please don't do X" just doesn't work with corporations.
right, for example someone convicted of killing their parents should fit right into an elderly care home staff team and convicted child rapists should not be barred from working in an elementary school, protecting honest and innocent people from criminals is basically the same thing as racism!
it's hilarious that "people" downvote comments pointing out the logical conclusion of the policies they defend
wouldn't making it illegal to discriminate based on criminal records prevent an elementary school from refusing to employ a candidate that is "fit for the job" (graduated from a good university, has years of experience in the field, etc) who just happens to have a child rape conviction on the basis that he has a child rape conviction? doesn't 1 + 1 equal 2?
The actions of the government should always be publicly observable. This is what keeps it accountable. The fear that a person might be unfairly treated due to a long past indiscretion does not outweigh the public's right to observe and hold the government to account.
Alternatively consider that you are assuming the worst behavior of the public and the best behavior of the government if you support this and it should be obvious the dangerous position this creates.
Thanks, it’s super refreshing to hear this take. I fear where we are headed.
I robbed a drug dealer some 15-odd years ago while strung out. No excuses, but I paid my debt (4-11 years in state max, did min) yet I still feel like I have this weight I can’t shake.
I have worked for almost the whole time, am no longer on parole or probation. Paid all fines. I honestly felt terrible for what I did.
At the time I had a promising career and a secret clearance. I still work in tech as a 1099 making way less than I should. But that is life.
What does a background check matter when the first 20 links on Google are about me committing a robbery with a gun?
Edit: mine is an extreme and violent case. But I humbly believe, to my benefit surely, that once I paid all debts it should be done. That is what the 8+ years of parole/probation/counseling was for.
What we do here in Sweden is that you can ask the courts for any court document (unless it is confidential for some reason).
But the courts are allowed to do it conditionally, so a common condition if you ask for a lot of cases is that you redact any PII before making the data searchable. This has the effect that people who actually care and know what to look for can find the information, but you can't just randomly search for someone and see what you get.
There is also a second registry, separate from the courts, that is used to keep track of people who have been convicted during the last n years and is used for background checks etc.
Fully agree. The AI companies have broken the basic pacts of public open data. Their ignoring of robots.txt files is but one example of their lack of regard. With the open commons being quickly pillaged we’ll end up in a “community member access only” model. A shift from “grab any books here you like, just get them back in a month” to “you’ll need to register as a library member before you can borrow”. I see that’s where we’ll end up. Public blogs and websites will suffer and respond first is my prediction.
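For reference, the opt-out being ignored is nothing more than a plain-text request. A quick sketch of how little machinery is involved (the user-agent strings are the commonly published crawler names; compliance is entirely voluntary):

    from urllib.robotparser import RobotFileParser

    rp = RobotFileParser()
    rp.parse([
        "User-agent: GPTBot",     # OpenAI's crawler
        "Disallow: /",
        "",
        "User-agent: CCBot",      # Common Crawl
        "Disallow: /",
        "",
        "User-agent: *",
        "Allow: /",
    ])

    print(rp.can_fetch("GPTBot", "https://example.org/judgments/2023-001"))      # False
    print(rp.can_fetch("SomeBrowser", "https://example.org/judgments/2023-001")) # True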
The names of minors should never be released in public (with a handful of exceptions).
But why shouldn't a 19 year old shoplifter have that on their public record? Would you prevent newspapers from reporting on it, or stop users posting about it on public forums?
Would you want the first thing to show up after somebody googles your name to be an accusation for improper conduct around a child? In theory, people could dig deeper and find out you won in court and were acquitted, but people here should know that nobody ever reads the article...
If you were hiring a childminder for your kids, would you want to know that they had 6 accusations for improper conduct around children in 6 different court cases - even if those were all acquittals?
As a parent, I would want to know everything about anyone who's going to be around my children in any capacity. That doesn't mean I have a right to it, though.
>openly admits his beliefs result in parents not making good decisions on who to allow near their children, keeps going anyway
great moral system you have there
That's a bad faith take.
In one comment you managed to violate a whole bunch of the HN commenting guidelines.
how else would you interpret admitting you don't think parents should have a right to know the backgrounds of the people with access to their children before making informed decisions on whether or not to allow it?
please, show me your good faith interpretation and i will take back my comment
The UK has an official system [1] for checking whether people should be allowed to work with vulnerable people.
[1] https://en.wikipedia.org/wiki/Disclosure_and_Barring_Service
If it was reported in a newspaper then that would likely already be the case.
> Would you prevent newspapers from reporting on it, or stop users posting about it on public forums?
Yes
It is the UK we're talking about after all...
Where the accused have rights too?
Where the journalists have very few rights, and people posting their bad (wrong) ideas (think) even fewer.
Not according to the WPFI (World Press Freedom Index), where it is ranked 20th.
Where speaking truth isn’t a right or a defense
Most media in Europe are required to obscure the names of criminals, for instance by removing the first name or the last name.
If you prohibit the punishment of minors, you create an incentive for criminals to exploit minors.
Why are we protecting criminals, just because they are minors? Protect victims, not criminals.
Unfortunately reputational damage is part of the punishment (I have a criminal record), but maybe it's moronic to create a class of people who can avoid meaningful punishment for crimes?
> If you prohibit the punishment of minors, you create an incentive for criminals to exploit minors.
This - nearly all drug deliveries in my town are done by 15-year-olds on overpowered electric bikes. Same with most shoplifting. The real criminals just recruit the schoolchildren to do the work because they know schoolchildren rarely get punished.
We protect minors because they are children, and they are allowed to make mistakes.
At a certain point, we say someone is an adult and fully responsible for their actions, because “that’s who they are”.
It’s not entirely nuanced—and in the US, at least, we charge children as adults all the time—but it’s understandable.
But you create an incentive for organized crime to recruit youth to commit crimes and not have to suffer the consequences.
At a certain point, poorly thought out "protections", turn into a system that protects organized crime, because criminals aren't as stupid as lawmakers, and exploit the system.
There is a big difference between making a mistake as a kid that lands you in trouble, and working as an underling for organized crime to commit robberies, drug deals, and violent crime, and not having to face responsibility for their actions.
The legal system has so many loopholes for youth, for certain groups, that the law is no longer fair, and that is its own problem, contributing to the decline of public trust.
> working as an underling for organized crime to commit robberies, drug deals, and violent crime
Have you ever considered that these children are victims of organized crime? That they aren't capable of understanding the consequences of what they're doing and that they're being manipulated by adult criminals?
The problem here isn't the lack of long term consequences for kids.
I used to be a drug dealer so I know what is going on and they aren’t victims, they are willing recruits.
12 year olds know it’s not right to sell crack.
The problem is the gap between lack of legal opportunities for youth and the allure of easy money, status and power from being a criminal. Doesn’t help that the media makes it look so fun and cool to be a gangster.
What's the alternative? A 14 year old steals a pack of gum, and he's listed as a shoplifter for the rest of his life?
Just because exceptions are exploitable, doesn't mean we should just scrap all the exceptions. Why not improve the wording and try to work around the exceptions?
If you don't think this crime is a big deal, then why do you think this crime would matter if it was in the public record tied to their name? These two ideas you have are not compatible.
I don't think stealing a pack of gum at 14 years old is a big deal, but many people have a huge problem understanding proportionality: To them, it's binary. You're either a criminal or not a criminal, and if this kid's record shows "shoplifter" until he dies, a significant number of people, including employers, will lump him into the "criminal" bucket for the rest of his life.
And what about the kids who get recruited for gang activity and do some pretty messed up stuff as kids? Should that not appear on a public record? This is where the problem lies: you essentially can only ever make it an all-or-nothing approach, as it gets a lot harder to determine what should or shouldn't be a part of a public record. Especially since, as you reflected in your comment, this becomes an opinion thing on whether someone thinks it matters or not what crime they did as a kid.
The problem that is happening in most Western countries is that criminal organizations take advantage of the fact that minors get reduced sentences and that their criminal records are usually kept sealed (unless tried as an adult). Whether it be having them steal cars, partake in organized shoplifting operations, muggings, gang activity, drug dealing, etc...
Your reasoning for why this information shouldn't be public record seems to boil down to the fact that you don't agree with other people's judgement of someone's past crimes. You'd like to see more forgiveness, and you don't think others will show the same forgiveness, so you want to take away all the access to information because of that. To me that seems like a view from a point of moral superiority.
I'd rather people get access to this information and be able to use their own brains to determine whether they want that person working there. If you were involved in shoplifting at 17 years old, and turn 18, I think it would be very fair for a store owner to be able to use that information to judge you when making a hiring decision. To me it doesn't make sense that you turn a magical age of 18 and suddenly your past poor decisions vanish into a void.
I think we can at least agree that children recruited by organized crime to steal cars, break into homes, assault people, and so on, should be treated differently than a kid who stole a pack of gum from a store. Whatever the solution is, it has to take into account the seriousness of the crime, and it has to discourage this binary criminal / not criminal thinking.
> Why are we protecting criminals, just because they are minors? Protect victims, not criminals.
Protect victims and criminals. Protect victims from the harm done to them by criminals, but also protect criminals from excessive, or, as one might say, cruel and unusual punishment. Just because someone has a criminal record doesn't mean that anything that is done to them is fair game. Society can, and should, decide on an appropriate extent of punishment, and not exceed that.
A thing can't simultaneously be public and not. There is no license to do research nor should there be, so if researchers can get it then anyone can.
If it's not supposed to be public then don't publish it. If it's supposed to be public then stop trying to restrict it.
Totally agree. And it goes beyond criminal history. Just because I choose to make a dataset publicly available doesn't mean I want some AI memorizing it and using it to generate profit.
Records of cases involving children are already excluded so that's not a relevant risk.
Between not delivering the data to AI companies, and barring it altogether is a fair distance. As far as I know, the MoJ is in talks with openAI themselves (https://www.ukauthority.com/articles/ministry-of-justice-rea...).
AI isn't the problem here. Once something goes on the internet it lives forever (or should be treated as such). So has it always been.
If something is expungable it probably shouldn't be public record. Otherwise it should be open and scrapable and ingested by both search engines and AI.
If you commit a crime, get caught, and that makes people trust you less in the future, that's just the natural consequences of your own actions.
No, I don't think if you shoplift as a teenager and get caught, charged, and convicted that automatically makes you a shoplifter for the rest of your life, but you also don't just get to wave a magic wand and make everyone forget you did what you did. You need to demonstrate you've changed and rebuild trust through your actions, and it's up to each individual person to decide whether they're convinced you're trustworthy, not some government official with a delete button.
Can you explain your reasoning about “forever convictions”, and for full disclosure, do you have a conviction and are thereby biased?
Additionally, do you want a special class of privileged people, like a priestly class, who can interpret the data/bible for the peasantry? That mentality seems the same as that behind the old Latin bibles and Latin mass that people were abused to attend, even though they had no idea what was being said.
So who would you bequeath the privileges of doing “research”? Only the true believers who believe what you believe, so you wouldn’t have to be confronted with contradictions?
And how would you prevent data exfiltration? Would you have your authorized “researchers” maybe go to a building, we can call it the Ministry of Truth, where they would have access to the information through telescreen terminals like how the DOJ is controlling the Epstein Files and then monitoring what the Congressmen were searching for? Think we would have discovered all we have discovered if only the Epstein people that rule the country had access to the Epstein files?
Yes, convictions are permanent records of one’s offenses against society, especially the egregious offenses we call felonies in the USA.
Should I, as someone looking for a CFO or just an accountant, not have the right to know that someone was convicted of financial crimes, which are usually preceded by a long trail of other transgressions and “mistakes” everyone knows weren’t mistakes? How would any professional association limit certification if that information is not accessible? Should Madoff have been able to get out and carry on being involved in finance and investments?
Please explain
Names and other PII can be replaced with aliases in bulk data, unsealed after ID verification on specific requests and within quotas. It’s not a big problem.
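For what it's worth, consistent aliasing in a bulk export is straightforward to sketch: something like a keyed hash gives every person a stable pseudonym without storing a lookup table. The key and the PERSON- prefix below are purely illustrative, not anything an actual records office uses:

```python
import hmac
import hashlib

SECRET_KEY = b"held-by-the-records-office"  # hypothetical key, stored and rotated securely

def alias(name: str) -> str:
    """Map a real name to a stable pseudonym without keeping a lookup table."""
    digest = hmac.new(SECRET_KEY, name.strip().lower().encode(), hashlib.sha256).hexdigest()
    return f"PERSON-{digest[:8].upper()}"

# The same person gets the same alias across the whole export, so researchers
# can still follow an individual through related cases without seeing the name.
print(alias("Jane Doe"))   # e.g. PERSON-4C1A9F02 (the exact value depends on the key)
print(alias("jane doe"))   # identical output: normalisation keeps the alias stable
```

Specific unsealing requests would then go through the ID-verified, quota-limited path rather than the bulk file.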
>Free to ingest and make someones crimes a permanent part of AI datasets resulting in forever-convictions? No thanks.
Is this the UK thing where PII is part of the released dataset? I know that Ukrainian rulings are all public, but the PII is redacted, so you can train your AI on largely anonymized rulings.
I think it should also be against GDPR to process sensitive PII like health records and criminal convictions without consent, but once it hits the public record, it's free to use.
Before you even get to the "AI dataset ... forever-conviction" or copyright issues, you need to address AI's propensity to hallucinate when fed legal data and questions.
> a teenager shouldn't have their permanent AI profile become "shoplifter" because they did a crime at 15 yo that would otherwise have been expunged after a few years.
On the other hand, perpetrating crime is a GREAT predictor of perpetrating more crime -- in general most crime is perpetrated by past perps. Why should this info not be available to help others avoid troublemakers?
https://bjs.ojp.gov/library/publications/returning-prison-0
https://www.prisonpolicy.org/graphs/sex_offense_recidivism_2...
https://usafacts.org/articles/how-common-is-it-for-released-...
https://pmc.ncbi.nlm.nih.gov/articles/PMC3969807/
https://ciceroinstitute.org/research/the-case-for-incarcerat...
I know some countries that issue a "certificate of no judicial history" even when the citizen does have one, provided they have served their jail time.
I think this is wrong; it should be reported in full for at least 5 years after the fact.
The jail time is the entire punishment. To allow punishment to continue afterward is to invite recidivism.
Jail time is not always the entire punishment, especially on the enlightened continent, where jail time is used sparingly. Keeping the conviction on record is a thing, because subsequent convictions often come with higher punishment. So depending on the crime category, there is a period of X years after which the record no longer counts, but it's not 0.
No, public doesn't mean access should be limited to academics of acceptable political alignment, it means open to the public: everybody.
That is the entire point of having courts, since the time of Hammurabi. Otherwise it's back to the clan system, where justice is made by avenging blood.
Making and using any "profiles" of people is an entirely different thing than having court rulings accessible to the public.
Exactly, public means the _public_: the fact that individual members of the public can see it doesn't mean that corporations are entitled to use or profit from that material.
> a teenager shouldn't have their permanent AI profile become "shoplifter" because they did a crime at 15 yo that would otherwise have been expunged after a few years.
The idea that society is required to forget crime is pretty toxic honestly.
Society does a poor job of assessing the degree of crime. It's too binary for people: You're either a criminal or not. There are too many employers who would look at a 40 year old sitting in front of them applying for a job, search his criminal record, find he stole a candy bar when he was 15, and declare him to be "a criminal" ineligible for employment.
A less incompetent employer would look at the conviction, realize it was for stealing a candy bar 25 years ago, and decide it doesn't matter.
Though if the details of the case were not public or hard to access they might assume it was worse than it was. (Realistically no child would get prosecuted for stealing a candy bar one time, but I'll grant maybe there are other convictions that sound worse without context.) Maybe the problem is actually that the data is not accessible enough, rather than too accessible?
The idea that society is required to forgive crime is pretty Christian, though.
That part of Christianity is somehow lost on Americans.
I'm sorry, but that's the equivalent of "I believe in free speech but not the right to hate speech". It's either free or not.
Actually it isn't the same.
We can allow access to private persons while disallowing commercial usage and forbidding the processing of private information (outside of law enforcement access).
Kinda like it was in pre-digital days. No, we can't go back, but we can at least _try_ to keep PII safeguarded.
Most EU countries have digital IDs, so access to records can be restricted and logged (for a limited time) to prevent misuse. Anyone caught trying to scrape can be restricted without limiting people from accessing or searching for specific _records or persons of interest_ (seriously, would anyone have time to read more than a couple of records each day?).
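A rough sketch of what that per-user logging and quota could look like, assuming callers are already authenticated via a digital ID; the quota value and the `load_record` helper are invented for illustration:

```python
from collections import defaultdict
from datetime import date, datetime

DAILY_QUOTA = 50  # illustrative: still far more records than anyone could read in a day

access_log = []                   # (timestamp, user_id, record_id), kept for a limited time
daily_counts = defaultdict(int)   # (user_id, date) -> records viewed today

def load_record(record_id: str) -> str:
    return f"<contents of {record_id}>"   # stand-in for the real record store

def fetch_record(user_id: str, record_id: str) -> str:
    """Serve one record to an identified user, logging the access and enforcing a quota."""
    key = (user_id, date.today())
    if daily_counts[key] >= DAILY_QUOTA:
        raise PermissionError("daily quota reached; scraping-scale access is blocked")
    daily_counts[key] += 1
    access_log.append((datetime.now(), user_id, record_id))
    return load_record(record_id)
```

Individuals looking up a handful of cases never notice the limit; anything vacuuming the whole dataset hits it immediately.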
I believe it would be more accurate to say: "I believe in free speech but only from accredited researchers. Oh btw the government can also make laws to control such accreditation"
The story is about a tool that allows journalists to get advance warning of court proceedings so they can choose to cover things of public interest.
It's not about any post-case information.
We should remember that local journalism has been dead for a decade in most of the UK, largely due to social media.
Any tool like this that can help important stories be told, by improving journalist access to data and making the process more efficient, must be a good thing.
Then it would be even worse if this ends up affecting post-case information.
>We should give up with the idea of databases which are 'open' to the public, but you have to pay to access, reproduction isn't allowed, records cost pounds per page, and bulk scraping is denied. That isn't open.
How about rate limited?
No. Open is open. Beyond DDoS protections, there should be no limits.
If load on the server is a concern, make the whole database available as a torrent. People who run scrapers tend to prefer that anyway.
This isn't someone's hobby project run from a $5 VPS - they can afford to serve 10k qps of readonly data if needed, and it would cost far less than the salary of 1 staff member.
> Open is open.
I’d then ask OpenAI to be open too since open is open.
You're talking about a tragedy of the commons situation. There is an organic query rate of this based on the amount of public interest. Then there is the inorganic vacuuming of the entire dataset by someone who wants to exploit public services for private profit. There is zero reason why the public should socialize the cost of serving the excess capacity caused by private parties looking to profit from the public data.
I could have my mind changed if the public policy is that any public data ingested into an AI system makes that AI system permanently free to use at any degree of load. If a company thinks that they should be able to put any load they want on public services for free, they should be willing to provide public services at any load for free.
Rate limiting is a DDoS protection.
Pedantically: rate limiting is DoS prevention, not DDoS prevention. If you rate limit per IP, you're not mounting effective protection against a distributed attack. If you're rate limiting globally, you're taking your service offline for everyone.
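To make the distinction concrete, per-IP limiting is typically something like a token bucket keyed by the client address. A minimal sketch with arbitrary numbers (and, as noted above, a botnet with thousands of addresses sails straight past it):

```python
import time
from collections import defaultdict

RATE = 5     # tokens refilled per second, per client IP (arbitrary)
BURST = 20   # maximum bucket size (arbitrary)

buckets = defaultdict(lambda: {"tokens": float(BURST), "last": time.monotonic()})

def allow(ip: str) -> bool:
    """Per-IP token bucket: throttles one noisy client without affecting others."""
    b = buckets[ip]
    now = time.monotonic()
    b["tokens"] = min(BURST, b["tokens"] + (now - b["last"]) * RATE)
    b["last"] = now
    if b["tokens"] >= 1:
        b["tokens"] -= 1
        return True
    return False
```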
The world is not black and white.
The issue with that is that people can then flood everything with huge piles of documents. That's bad enough when it's all clean, OCR'd digital data you can quickly download in its entirety, but if you're stuck waiting between downloads, you'll never find out what they don't want you to find out.
It's like being made to search through sand: it's bad enough when you can use a sieve, but then they tell you that you can only use your bare hands, and your search efforts are made useless.
This is not a new tactic btw and pretty relevant to recent events...
Systems running core government functions should be set up to be able to efficiently execute their functions at scale, so I'd say it should only restrict extreme load, ie DoS attacks
If the rate limit is reasonable (allows full download of the entire set of data within a feasible time-frame), that could be acceptable. Otherwise, no.
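A back-of-the-envelope check makes "feasible" concrete; both numbers here are made up for illustration:

```python
records_in_dataset = 5_000_000   # hypothetical corpus size
allowed_per_day = 10_000         # hypothetical per-client limit

days_to_mirror = records_in_dataset / allowed_per_day
print(f"{days_to_mirror:.0f} days for one client to mirror everything")  # 500 days
```

At that rate a single client needs well over a year, which most people would not call a feasible time-frame; the limit would need to be much looser (or bulk dumps offered separately) to pass the test above.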
> Something is either public record - in which case it should be on a government website for free, and the AI companies should be free to scrape to their hearts desire...Or it should be sealed for X years and then public record.
OR it should be allowed for humans to access the public record but charge fees for scrapers
I don't know what the particular issue is in this case but I've read about what happens with Freedom of Information (FOI) requests in England: apparently most of the requests are from male journalists/writers looking for salacious details of sex crimes against women, and the authorities are constantly using the mental health of family members as an argument for refusing to disclose material. Obviously there are also a few journalists using the FOI system to investigate serious political matters such as human rights and one wouldn't want those serious investigations to be hampered but there is a big problem with (what most people would call) abuse of the system. There _might_ perhaps be a similar issue with this court reporting database.
England has a genuinely independent judiciary. Judges and court staff do not usually attempt to hide from journalists stuff that journalists ought to be investigating. On the other hand, if it's something like an inquest into the death of a well-known person which would only attract the worst kind of journalist they sometimes do quite a good job of scheduling the "public" hearing in such a way that only family members find out about it in time.
A world government could perhaps make lots of legal records public while making it illegal for journalists to use that material for entertainment purposes, but we don't have a world government: if the authorities in one country were to provide easy access to all the details of every rape and murder in that country, then so-called "tech" companies in another country would use that data for entertainment purposes. I'm not sure what to do about that, apart, obviously, from establishing a world government (which arguably we need anyway in order to handle pollution and other things that are a "tragedy of the commons", but I don't see it happening any time soon).
Without numbers this sounds made up
I should clarify that I was talking about the FOI requests submitted to a particular authority: I think it was the National Archives or some subsection thereof. If you're talking about all FOI requests submitted to all authorities then probably most of them don't relate in any way to criminal cases. I think we don't really need precise numbers to observe that public access to judicial data can be abused, which is all I wanted to say, really. I wrote too many words.
One of the problems with open access to these government DBs is that it gives out a lot of information that spammers and scammers use.
E.g. if you create a business, then that email address/phone number is going to get phished and spammed to hell and back again. It's all because the government makes that info freely accessible online. You could be a one-man self-employed business and the moment you register you get inundated with spam.
Spoken like someone who's never spent thousands of dollars and literal years struggling to get online records corrected to reflect an expungement. Fuck anything that makes that process even more difficult which AI companies certainly will.
I want information to be free.
I don't think all information should be easily accessible.
Some information should be in libraries, held for the public to access, but have that access recorded.
If a group of people (citizens of a country) have data stored, they ought to be able to access it, but others maybe should pay a fee.
There is data in "public records" that should be very hard to access, such as evidence of a court case involving the abuse of minors that really shouldn't be public, but we also need to ensure that secrets are not kept to protect wrongdoing by those in government or in power.
Totally agreed! This is yet another example of reduced friction due to improved technology breaking a previously functional system without really changing the qualities it had before. I don't understand why this isn't obvious to more people. It's been said that "quantity has a quality all its own", and this is even more true when that quantity approaches infinity.
Yes, license plates are public, and yes, a police officer could have watched to see whether or not a suspect vehicle went past. No, that does not mean that it's the same thing to put up ALPRs and monitor the travel activity of every car in the country. Yes, court records should be public, no, that doesn't mean an automatic process is the same as a human process.
I don't want to just default to the idea that the way society was organized when I was a young person is the way it should be organized forever, but the capacity for access and analysis when various laws were passed and rights were agreed upon are completely different from the capacity for access and analysis with a computer.
Yes. This should be held by the London Archives in theory with the rest of the paper records of that sort.
They have the ability to seal documents until set dates and deal with digital archival and retrieval.
I suspect some of this is it's a complete shit show and they want to bury it quickly or avoid having to pay up for an expensive vendor migration.
I think the right balance is to air-gap the database and allow the public access by a simple standard: show up somewhere in person with a USB stick.
I think it's right to prevent random drive-by scraping by bots/AI/scammers. But it shouldn't inhibit consumers who want to use it to do their civic duties.
The idea that an individual being able to look up any case they want is the same thing as a bot being able to scrape and archive an entire dataset forever is just silly.
One individual could spend their entire life going through one by one recording cases and never get through the whole dataset. A bot farm could sift through it in an hour. They are not the same thing.
>and the AI companies should be free to scrape to their hearts desire...
Why? They generate massive traffic, why should they get access for free?
> Nothing that goes through the courts should be sealed forever.
What about family law?
Yep, these are commonly sealed records and having worked with family law lawyers there are things that happened to the victims that should never be unsealed.
Even outside of family law there are many justifiable reasons for sealing and even expunging (deletion) of records. I’m a believer that under the correct circumstances criminal records should be sealed & even in some cases expunged as well. People deserve a second chance.
Family law is just the most obvious and unarguable example.
This is a good use case for a blockchain. AI companies can run their own nodes so they're not bashing infra that they don't pay for. Concerned citizens can run their own nodes so they know that the government isn't involved in any 1984-type shenanigans. In the sealed-for-X-years case, the government can publish a hash of the blocks that they intend to publish in X years so that when the time comes, people can prove that nobody tampered with the data in the interim.
The government can decide to stop paying for the infra, but the only way to delete something that was once public record should be for all interested parties to also stop their nodes.
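The commit-now, reveal-later part needs nothing exotic; a rough sketch, assuming the published commitment is just a SHA-256 over a canonical serialisation of the sealed record (contents invented):

```python
import hashlib
import json

def commitment(record: dict) -> str:
    """Hash that can be published today for a record that stays sealed for X years."""
    canonical = json.dumps(record, sort_keys=True).encode()
    return hashlib.sha256(canonical).hexdigest()

sealed = {"case": "2025/001", "judgment": "..."}   # invented example record
published_now = commitment(sealed)                  # this value is public immediately

# X years later, when the record itself is released, anyone can re-hash it and
# confirm it matches the published commitment, proving nothing changed in between.
assert commitment(sealed) == published_now
```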
I like the part where you created a system where if someone has enough resources they can just alter the judicial record.
It seems like you're assuming we'd use proof-of-work. That would be crazy.
The consensus mechanism would be: block is good if it has a judge's signature (or some other combination of signatures from other elected officials, depending on how the laws work where you are).
Or are you proposing that somebody out there is prepared to subvert the signatures by computing a hash collision or some other herculean task?
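A signature-based validity rule is likewise only a few lines; this sketch assumes the third-party `cryptography` package and an invented block payload, not any particular chain design:

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

judge_key = Ed25519PrivateKey.generate()    # in reality the court holds this key
judge_public = judge_key.public_key()       # distributed to every node

block = b'{"case": "2025/001", "ruling": "..."}'   # invented block payload
signature = judge_key.sign(block)

def block_is_valid(payload: bytes, sig: bytes) -> bool:
    """Nodes accept a block only if it verifies against the judge's public key."""
    try:
        judge_public.verify(sig, payload)
        return True
    except InvalidSignature:
        return False

print(block_is_valid(block, signature))              # True
print(block_is_valid(b"tampered block", signature))  # False
```

No amount of hash power lets an outsider forge that; the remaining attack surface is the key itself, which is the same trust problem a plain signed government database already has.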