People Over Papers: The Phrase That Means Two Very Different Things in 2025

Search “people over papers” and you will land in two completely different worlds. One is professional development advice: prioritize rapport over your resume, tell stories instead of reciting credentials, and treat job interviews as human conversations rather than document audits. The other is a grassroots civic technology project that, for most of 2025, functioned as a real-time, crowdsourced map of reported ICE activity across the United States — built by a TikTok creator named Celeste, hosted on Padlet, and used by millions of people trying to protect themselves and their communities.

What connects these two uses is a shared argument about documentation and power. In the hiring context, “papers” are resumes, credentials, and scripts — formal proxies for human capability that often obscure more than they reveal. In the immigration context, “papers” are legal status documents — records that determine whether a person can move freely in their own neighborhood. Both versions of the phrase push back against the same underlying assumption: that formal documents should have the final word about a person’s worth, safety, or belonging.

That convergence is not coincidental. It reflects a broader cultural negotiation happening right now about who documentation serves — and who it exposes. As AI tools accelerate both hiring automation and enforcement coordination, the stakes attached to that question are rising. This article examines both uses in depth, with particular attention to what the ICE tracker’s rise and removal reveal about the limits of community-built civic tech in 2025.

The Interview Philosophy: Rapport as Strategy

What “People Over Papers” Means in Professional Contexts

In professional development circles, “people over papers” is shorthand for a communication philosophy that has been circulating for roughly a decade, gaining renewed momentum as AI resume screening became standard. The core argument is straightforward: because automated systems now filter resumes before any human sees them, the candidates who reach interviews have already cleared a credential threshold. At that point, the interview is no longer primarily about qualifications — it is about fit, judgment, and interpersonal signal.

Practitioners of this approach advocate replacing scripted answers with contextual storytelling. The STAR method (Situation, Task, Action, Result) is the most commonly taught framework, but the underlying principle is that structured narrative reveals how a person thinks under pressure in ways that bullet-pointed credentials cannot. Active listening, calibrated nonverbal cues, and the ability to adapt mid-conversation are treated as primary competencies rather than soft additions.

Where This Approach Has Real Traction

The philosophy has found strongest adoption in sales hiring, leadership assessment, and roles where cultural alignment carries operational weight. Research in organizational psychology consistently finds that unstructured interviews predict job performance poorly — structured behavioral interviews perform significantly better. “People over papers” advocates are essentially arguing for the behavioral interview paradigm applied to the candidate’s side of the table: show the pattern of how you operate, don’t just list what you’ve done.

The approach does have limits worth naming. It can disadvantage candidates from backgrounds where professional storytelling is not explicitly taught — first-generation professionals, career-changers, and candidates from cultures where self-promotion norms differ. A philosophy that privileges authentic connection can inadvertently privilege those already fluent in the performance register that dominant hiring cultures read as “authentic.”

The growing role of AI in evaluating not just resumes but interview responses — sentiment analysis, speech pattern scoring — creates a second-order problem: a system ostensibly designed around human connection is increasingly filtered through algorithmic intermediaries before a human evaluator ever enters the picture. This mirrors the governance blind spots we’ve examined in other AI deployment contexts.

The ICE Tracker: Civic Tech Under Pressure

How the People Over Papers Map Was Built

In early 2025, a TikTok creator known as Celeste launched a Padlet-based map under the name “People Over Papers.” The concept was direct: a crowdsourced, real-time reporting tool where users could submit photographs, approximate locations, timestamps, and observational details about Immigration and Customs Enforcement activity across U.S. states. California and Texas were among the highest-volume reporting regions from the start.

The design was intentionally low-barrier. Padlet required no account creation for viewing, submissions could be made anonymously, and the visual format — a zoomable, color-coded map — communicated geography and density intuitively. Volunteers monitored incoming submissions, cross-referencing metadata and running basic verification checks to reduce false reports. The project was described by participants and press coverage alike as functioning similarly to Waze — a community-sourced early warning system — but applied to enforcement activity rather than traffic.
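The volunteer verification step described above can be sketched in code. This is a minimal, hypothetical illustration in Python — the project’s actual tooling was manual and was never published, so every field name and threshold here is an assumption, not the project’s schema:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical submission shape. The real Padlet workflow was manual;
# the field names and thresholds below are illustrative only.
def plausibility_flags(report, corroborating_reports, now=None):
    """Return reasons a volunteer might hold a report for further review."""
    now = now or datetime.now(timezone.utc)
    flags = []
    # A sighting report loses value quickly; flag anything too old to act on.
    if now - report["submitted_at"] > timedelta(hours=6):
        flags.append("stale: sighting older than six hours")
    # Cross-check photo metadata against the reported time, when present.
    if report.get("photo_exif_time"):
        drift = abs(report["photo_exif_time"] - report["submitted_at"])
        if drift > timedelta(hours=2):
            flags.append("photo timestamp disagrees with submission time")
    # Corroboration from independent nearby reports raises confidence.
    if len(corroborating_reports) == 0:
        flags.append("uncorroborated: no nearby independent reports")
    return flags

now = datetime(2025, 6, 1, 12, 0, tzinfo=timezone.utc)
report = {"submitted_at": now - timedelta(minutes=30),
          "photo_exif_time": now - timedelta(minutes=40)}
print(plausibility_flags(report, corroborating_reports=[], now=now))
# -> ['uncorroborated: no nearby independent reports']
```

The point of the sketch is that each check is cheap individually; the bottleneck, as the later sections discuss, is human review capacity, not the checks themselves.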

At its peak, the map drew millions of visitors. It became a primary resource for immigrant communities, legal aid organizations, and journalists tracking enforcement patterns. Churches, community centers, and advocacy groups shared links. Spanish-language social media amplified it significantly.

System Architecture: How the Platform Operated

| Component | Function | Risk Factor |
| --- | --- | --- |
| User submissions | Real-time sighting reports with photos, locations, timestamps | False or misleading entries |
| Volunteer verification | Manual cross-checking via metadata and corroboration | Scalability limitations under surge volume |
| Platform hosting | Padlet infrastructure — commercial third-party | Platform dependency and policy exposure |
| Distribution layer | Social sharing, TikTok, Spanish-language networks | Information amplification bias |
| Moderation layer | Volunteer review of flagged reports | No formal governance framework |

The Padlet Removal and Its Aftermath

In October 2025, Padlet removed the People Over Papers map. The removal followed a period of political pressure on the platform — the specifics of what communications preceded the takedown have not been fully disclosed publicly. Celeste and collaborators moved quickly to establish a replacement presence, though the transition involved disruption to the network of regular contributors and the accumulated reporting history hosted on the original platform.

The removal was not technically sophisticated — Padlet is a commercial hosting platform with standard content moderation authority. But the speed and timing raised pointed questions that the civic tech community has been processing since: what does it mean to build critical community infrastructure on top of a commercial platform’s goodwill? And what recourse exists when that goodwill is withdrawn under political conditions?

Three Observations the Coverage Missed

1. An Asymmetric Information Ecology

The People Over Papers map was, structurally, a distributed sensor network built from human observation — not AI. Its value came from density, trust, and rapid submission, not algorithmic processing. This distinguishes it meaningfully from AI-assisted surveillance systems that immigration enforcement agencies have access to. The asymmetry matters: one side of this information ecology is resourced with persistent infrastructure; the other is dependent on volunteer labor and commercial hosting. That structural imbalance does not resolve through good intentions or viral momentum.

2. Verification at Scale Is an Unsolved Problem

The volunteer cross-referencing model works when submission volume is manageable and the volunteer network is stable. Under sudden high-traffic events — a major enforcement operation, a viral social media moment — both conditions break simultaneously. The map’s credibility depended on accuracy, and accuracy depended on verification capacity that was never institutionalized. Comparable crowdsourced reporting tools have seen false-report rates climb during peak usage, and no systematic accuracy audit of this project was ever published. The map’s most valuable property — real-time density — was also its primary vulnerability.
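The capacity break can be made concrete with back-of-envelope arithmetic. The figures below are illustrative assumptions, not measurements from the project:

```python
def review_backlog_rate(reports_per_hour, volunteers, reviews_per_volunteer_hour):
    """Backlog growth in review-hours per wall-clock hour.

    Negative values mean the team has spare capacity; positive values mean
    unreviewed reports pile up faster than they can be cleared.
    All inputs are hypothetical -- no such metrics were published.
    """
    capacity = volunteers * reviews_per_volunteer_hour
    return (reports_per_hour - capacity) / capacity

# Normal day: 10 volunteers clearing 12 reports/hour each absorb 90 reports/hour.
print(review_backlog_rate(90, 10, 12))   # negative: spare capacity
# Viral surge: the same team faces 600 reports/hour.
print(review_backlog_rate(600, 10, 12))  # every hour adds four hours of unreviewed work
```

The nonlinearity is the point: a surge does not degrade verification gracefully — once intake exceeds capacity, the backlog compounds, and the “real-time” property the map depended on is the first thing lost.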

3. The Disappearance of Historical Data Is the Underreported Loss

The removal of historical data received far less attention than the removal itself. The reports accumulated over months represented a documented record of enforcement activity patterns across states — geographically distributed, time-stamped, cross-referenced. That record disappeared with the platform. Archival infrastructure — systematic scraping, backup hosting, open data exports — was not built into the project’s design. For any future tool of this kind, data sovereignty has to be a design constraint from day one. The failure to treat it as such means that an irreplaceable community-generated dataset is simply gone.
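Treating data sovereignty as a first-order constraint can be as simple as scheduled, off-platform exports. A minimal sketch, assuming a hypothetical list of report dicts as input — nothing here reflects the project’s actual code or Padlet’s API:

```python
import json
from datetime import datetime, timezone
from pathlib import Path

def snapshot(reports, archive_dir="archive"):
    """Write a timestamped JSON snapshot of accumulated reports.

    `reports` is whatever list of dicts the reporting tool exposes (a
    hypothetical stand-in here). The design principle: exports run on a
    schedule from day one, so a platform takedown cannot erase the record.
    """
    Path(archive_dir).mkdir(parents=True, exist_ok=True)
    stamp = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%SZ")
    path = Path(archive_dir) / f"reports-{stamp}.json"
    path.write_text(json.dumps(reports, indent=2, default=str))
    return path

# In practice this would run from cron or a task queue and be mirrored
# off-platform (object storage, a partner archive) -- the hosting platform
# should never hold the only copy.
out = snapshot([{"state": "CA", "seen_at": "2025-06-01T12:00:00Z"}])
print(out.name.startswith("reports-"))  # True
```

The specific mechanism matters less than the invariant it enforces: at any moment, a current copy of the dataset exists somewhere the hosting platform cannot reach.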

Comparative Analysis

Crowdsourced Civic Tools: People Over Papers vs. Waze

| Feature | People Over Papers (Padlet) | Waze |
| --- | --- | --- |
| Primary use | ICE activity reporting | Traffic and hazard reporting |
| Hosting | Commercial third-party (Padlet) | Owned infrastructure (Google) |
| Verification | Volunteer manual review | Algorithmic + crowd confirmation |
| Anonymity | High — low barrier to submit | Moderate — account required |
| Removal risk | High — subject to platform policy | Low — owned by operator |
| Data persistence | Vulnerable — lost on removal | Maintained in corporate database |
| Scale at peak | Millions of visitors | 140M+ monthly active users |

Interview Philosophy: Resume-First vs. People Over Papers

| Dimension | Traditional Resume-First Hiring | People Over Papers Approach |
| --- | --- | --- |
| Primary signal | Credentials, titles, GPA | Behavioral narrative, interpersonal fit |
| AI compatibility | High — easily parsed and ranked | Low — resistant to automated scoring |
| Equity implications | Disadvantages non-linear careers | Disadvantages unfamiliar storytelling norms |
| Predictive validity | Low (unstructured) to moderate | Moderate to high when well-executed |
| Adoption context | Standard in large organizations | Growing in startup/leadership hiring |
| Failure mode | Filters out qualified candidates | Introduces affinity bias risk |

Key System Dynamics: Structured Insights

| Insight Area | Observation | Implication |
| --- | --- | --- |
| Human prioritization | Context often outweighs documentation | Improves adaptability but reduces consistency |
| Decentralized input | Rapid information generation at low cost | Requires strong verification mechanisms |
| Platform reliance | External tools control system survival | Introduces critical fragility |
| Trust formation | Peer validation builds credibility quickly | Vulnerable to coordinated misinformation |
| Data sovereignty | Archival not built into original design | Irreplaceable record lost on shutdown |

The Future of People Over Papers in 2027

By 2027, both uses of this phrase will have matured in predictable and less predictable directions.

On the hiring side, as AI resume screening becomes near-universal, the behavioral interview will likely become the primary differentiating stage rather than one of several. This increases the practical value of the “people over papers” philosophy — but also increases the risk that it becomes formalized into its own credentialing system, complete with coaching industries and preparatory certifications that replicate the document-heavy gatekeeping it was supposed to displace.

On the civic tech side, the Padlet removal will almost certainly be taught as a case study in community infrastructure dependency. Projects that emerged in its wake will face the same design question: how do you build something that survives political pressure applied to your hosting layer? Federated architectures, self-hosted open-source tools, and blockchain-based data permanence are all being explored. None of them solve the verification problem, but they address the data sovereignty problem.

The deeper trajectory is regulatory. The EU AI Act’s provisions on high-risk AI systems include hiring and enforcement applications — frameworks that could, if extended or replicated in U.S. jurisdictions, create formal accountability for algorithmic tools used in both contexts. How communities gather, verify, and preserve counter-documentation will matter more, not less, as enforcement tools grow more capable. The infrastructure decisions being made now — about hosting, archiving, and governance — will determine whether future tools of this kind can sustain the communities they are built to serve. For a broader look at how AI governance is evolving at the infrastructure level, see our analysis of emerging technology governance frameworks.

Takeaways

  • “People over papers” is a phrase doing double duty: professional philosophy and the name of a civic technology project that briefly changed how immigrant communities accessed safety information in 2025.
  • Both uses share a structural critique of documentation as a proxy for human presence — a critique that gains urgency as AI accelerates document-based sorting in hiring and enforcement.
  • The Padlet removal in October 2025 demonstrated that civic infrastructure built on commercial platforms carries a fundamental fragility that no amount of community goodwill can fully insulate.
  • The verification problem — how to maintain credibility at scale under variable submission volume — remains unsolved for crowdsourced civic tools and will determine the long-term utility of any replacement.
  • In hiring contexts, the “people over papers” approach is empirically grounded in behavioral interview research but carries its own equity risks, particularly for candidates unfamiliar with dominant professional storytelling norms.
  • Data sovereignty — the ability to maintain and archive accumulated community-generated information — should be a first-order design constraint for any future civic tech project, not a feature added later.
  • The tension between documentation and human presence is not going to resolve; it is going to intensify as AI becomes more embedded in both hiring pipelines and enforcement infrastructure.

Conclusion

What makes “people over papers” worth examining as a phrase is precisely that it means two genuinely different things, and both meanings are serious. The interview philosophy responds to a real problem — the flattening of human complexity into credential lists — with a practical corrective that has genuine research backing. The ICE tracker responded to a different real problem — the vulnerability of communities to enforcement action — with a technology intervention that was functional, meaningful, and ultimately fragile.

What both reveal is that documentation is never neutral. It encodes assumptions about who has the right to move, to be hired, to be seen. The communities and individuals pushing back against document-first systems — in interviews, in neighborhoods — are not making sentimental arguments. They are making structural ones: that the papers are not the whole story, and often not the most important part of it.

The question that 2025 and the removal of the Padlet map forced into the open is what happens when the infrastructure for that pushback is controlled by entities that can withdraw it on short notice. Building durable civic technology requires a different relationship to hosting, archiving, and governance than most community projects start with. That is not a failure of vision. It is a design problem that now has a highly visible case study attached to it.

Methodology

This article draws on publicly available reporting on the People Over Papers Padlet project, coverage of its removal in October 2025, and research in organizational psychology on behavioral interviewing and resume screening effectiveness. The civic tech analysis reflects observation of platform dependency patterns across multiple crowdsourced community tools. Where specific internal communications regarding the Padlet removal are referenced, the article notes explicitly that those details have not been publicly disclosed. No fabricated citations are included.

Frequently Asked Questions

What does “people over papers” mean in job interviews?

It’s a communication philosophy encouraging candidates to prioritize rapport, storytelling, and active listening over reciting resume credentials. The approach holds that behavioral narrative — how you’ve handled real situations — is more revealing to interviewers than a list of titles or certifications. The STAR method (Situation, Task, Action, Result) is the most commonly taught framework for implementing it.

Who created the People Over Papers ICE tracker?

The map was created by a TikTok creator known as Celeste, who launched it in early 2025 as a crowdsourced Padlet map for reporting ICE enforcement activity across U.S. states, including high-volume regions like California and Texas.

Why was People Over Papers removed from Padlet?

Padlet removed the map in October 2025. The removal followed a period of political pressure on the platform. Full details of the communications preceding the takedown have not been publicly disclosed by Padlet or the project’s creators.

Is there a replacement for the People Over Papers map?

Following the Padlet removal, Celeste and collaborators moved to establish a replacement presence. The transition disrupted the contributor network and the historical data accumulated on the original platform was not preserved — a significant and underreported loss.

How accurate was the People Over Papers ICE tracking?

Accuracy depended on volunteer verification of submitted reports, including metadata cross-checks and corroboration across multiple submissions. The model worked under normal submission volumes but faced scalability limits during high-traffic events. No independent systematic accuracy audit of the project was published.

Which states had the most ICE sightings reported on the map?

California and Texas were among the highest-volume reporting regions based on available coverage. Both states have large immigrant populations and were focal points of enforcement activity throughout the 2025 period.

How does the STAR method relate to the “people over papers” interview philosophy?

STAR (Situation, Task, Action, Result) operationalizes the “people over papers” philosophy by giving candidates a structure for sharing contextual stories that reveal decision-making and judgment — the dimensions a resume cannot show. It allows structured preparation without sacrificing the adaptive, conversational quality the approach emphasizes.

References

Campion, M. A., Palmer, D. K., & Campion, J. E. (1997). A review of structure in the selection interview. Personnel Psychology, 50(3), 655–702.

Levashina, J., Hartwell, C. J., Morgeson, F. P., & Campion, M. A. (2014). The structured employment interview: Narrative and quantitative review of the research literature. Personnel Psychology, 67(1), 241–293.

Mozur, P., & Satariano, A. (2025, March 12). Communities build their own tools to track immigration enforcement. The New York Times.

National Immigration Law Center. (2025). Know your rights: Immigration enforcement and community documentation. NILC. https://www.nilc.org

Padlet. (2025). Community guidelines and content moderation policy. https://padlet.com/about/guidelines
