Laila Lalami’s disturbing new novel, The Dream Hotel, raises an existential question: How to read dystopian fiction when what has been imagined is now a familiar, quotidian reality? Lalami’s protagonist, a thirty-eight-year-old American-born academic and archivist at the Getty Museum named Sara Hussein, is pulled aside by functionaries from the government’s Risk Assessment Administration (RAA) when she arrives at Los Angeles International Airport after attending a conference in London. An algorithm has flagged her as “an imminent risk” for a crime she has not committed, based in part on her dreams. Sara is then remanded to a “retention center” for a “forensic observation” until she lowers her “risk score”—an inscrutably derived number that supposedly represents the likelihood of committing a crime in the future. A long-standing piece of legislation, the Crime Prevention Act, gives authorities the power to incarcerate anyone they deem potentially dangerous. A majority of Americans—62 percent—support this “pre-crime” law, because it promises to prevent misconduct before it happens.
That is the pretext for the fictional world Lalami has created. Perhaps by chance—or perhaps she herself has an uncanny ability to anticipate crimes before they have happened—real life caught up to her imagination. In March—the month Lalami’s book was published—Rümeysa Öztürk, a Turkish doctoral student at Tufts University, was abducted by a group of masked men as she walked down the street in Somerville, Massachusetts. The men, it turned out, were agents from Immigration and Customs Enforcement (ICE). After being hustled into an unmarked car, Öztürk was shuttled between a number of government lockups, landing finally at a detention center in Louisiana, where she sat for six weeks, charged with no crime.
Rather like Sara Hussein, Rümeysa Öztürk was singled out by the government for her thoughts: she had coauthored a March 2024 opinion piece in the Tufts student newspaper decrying Israel’s military operation in Gaza. The government justified holding Öztürk by claiming that she was “involved in associations that ‘may undermine US foreign policy by creating a hostile environment for Jewish students and indicating support for a designated terrorist organization.’” (Note that the government was not making the case that Öztürk herself was a member of those organizations.) When she was finally released—though she still faces deportation—US District Court Judge William Sessions III stated that “there is absolutely no evidence that she has engaged in violence or advocated violence.”
From the start of her career, Lalami has been a nuanced and discerning chronicler of the immigrant experience and of those who exist at the margins of society. Her nonfiction book Conditional Citizens is a personal exploration of what it means to be an American. Her novel The Other Americans was a finalist for the 2019 National Book Award in fiction.
The Dream Hotel is different. In it, the law, randomly applied—not descent or status—is what marginalizes people. Once Sara is in the clutches of the RAA, she is sent to Madison, an elementary school that has been repurposed as a holding facility run by a private company called Safe-X, for a mandatory three-week observation period. But do not call it a prison, and do not call the women at Madison prisoners. As Lalami describes it, “The retainees have to eat what they’re given, do what they’re told, sleep when the lights are out, but they’re considered FUO. Free, under observation.”
Much of the novel is given over to Sara’s day-to-day life at Madison—the friendships and alliances among the women, Sara’s longing to return to “real” life, visits from her husband and young children, her fear and frustration as those visits diminish, the cruelties and indignities exacted by the guards, and her estrangement from life as she has known it and expected it to be. Lalami is a subtle polemicist, which is to say that she doles out Sara’s pain and depredation by degrees so that it never feels contrived or sensational. Over time Sara’s trajectory begins to resemble the first four of Kübler-Ross’s stages of grief: denial, anger, bargaining, and depression. “What has retention taught her these last few months?” Lalami writes.
That the whole world can shrink to a room.
That time is the god of all things.
That the rules don’t have to make sense.
That no matter how unjust the system is, she is expected to submit to it in order to prove that she deserves to be free of its control.
Still, the challenge for Sara is to resist arriving at the fifth stage—acceptance—and becoming a docile cog in the grinding machinery of the state or the (imminently violent) person the algorithm says she is. Sara gets written up by the guards for a slew of seemingly random, insignificant infractions—having a “noncompliant hairstyle” and loitering in a hallway among them—that result in a forty-five-day extension of her sentence. She does not know it, not at first, but Safe-X makes most of its money from these extensions; the guards—Safe-X employees—have a financial incentive to keep people locked up. In Sara’s case, the write-ups continue to accumulate to the point where she is still being held nearly a year later. As her lawyer explains, “The RAA had full authority to keep her in custody because the courts defined retention as precaution, not punishment.” In words that are especially prescient now, Lalami writes:
That they have committed no crime is beside the point. In any case crime is relative, its boundaries shifting in service of the people in power…. The law separates the permitted from the forbidden, but it doesn’t require that a crime be committed before the agents acting in its name deploy the full force of their power.
Despite entreaties from her lawyer, the RAA refuses to explain the mechanics of the algorithm responsible for Sara’s incarceration—what data it uses and how that data is weighted—arguing that, because it is the creation of a private company, it is proprietary intellectual property. Sara has no way of knowing, for instance, how her Muslim religion or her Arabic-sounding surname figures into it. The agency already has access to data from her pre-retention life that has been stored in the aptly named OmniCloud; these bits and pieces may have seemed inconsequential at the time, but in aggregate the algorithm and the RAA may understand them differently. She knows that the authorities have combed through her social media posts and must have seen that she once had her account suspended after a heated exchange with a former neighbor who had been peddling conspiracy theories. In retribution he reported her for posting nude archival photographs. “But everyone had stories like this, didn’t they?” she thinks.
At the airport an RAA officer named Segura asks her to hand over her phone, which she does, reluctantly:
She didn’t know what he was looking for and she was fairly sure he didn’t either, so the whole procedure left her feeling deeply violated, as if a stranger were standing at the window of her bedroom, looking in. It occurred to her that what Segura was checking wasn’t so much her phone, but her compliance with his directive to hand it over.
The authorities also have Sara’s DNA on file, most likely culled from tests administered to school-age children so they could be easily identified in the event of a school shooting or a natural disaster. Surveillance is everywhere. Sara has become inured to it.
And then there is the matter of her dreams. After giving birth to twins, Sara had a device implanted at the base of her brain to help ease her insomnia. At the time she assumed that the device—though it is called the Dreamsaver—simply recorded when she fell asleep and when she woke up. In fact, it has also been monitoring her dreams, which are fed into the algorithm every day.
Sara, of course, cannot control her dreams; the RAA, though, can use them as evidence against her. When she complains to Officer Segura that dreams are not real, he tells her that whether they are real or not is a philosophical question above his pay grade and that her dreams are just one of the variables assessed by the algorithm. While the consent form Sara signed before the Dreamsaver surgery, and which she can barely remember, promised that she would retain the intellectual property rights to this “content,” it also stated that the company would never share private data with third parties—except when “a legal enforcement authority” requested it. Thanks to the Crime Prevention Act, those authorities have automatic access to a wide array of private, personal information, typically without an individual’s knowledge.
By her own account, Lalami began writing The Dream Hotel in 2014. This was a year after Edward Snowden’s breach of the National Security Agency revealed, among other things, that the United States government was spying on its own citizens, reading e-mails and social media posts and listening to phone calls. And it wasn’t just the government that was eavesdropping. By then nearly two billion people globally carried smartphones, including around 160 million Americans, which gave companies such as Google the ability to follow the activities of users across millions of websites. While almost everyone surveyed by the Pew Research Center that year indicated that consumers had “lost control over how personal information is collected and used by companies,” more than half also said that they were willing to share personal information in order to access sites for free.
This personal information is the fuel that powers the largely unregulated data brokerage industry; a report issued by the Federal Trade Commission at the time noted that data brokers, who were quietly amassing detailed behavioral profiles on just about everyone, operated without transparency or accountability. What the report failed to mention, though, was that the government itself buys this personal behavioral data, which allows it to bypass the Fourth Amendment’s prohibition on warrantless searches. A report in The Wall Street Journal from Trump’s first term found that his administration “has bought access to a commercial database that maps the movements of millions of cellphones in America and is using it for immigration and border enforcement.” Four months into his second term, Trump canceled the Biden administration’s proposed rules curtailing data brokers’ practice of selling Americans’ personal information.
By 2014, too, a significant number of police departments across the country were using software that promised to predict where crimes would occur. Three years earlier Time magazine had deemed predictive policing one of the year’s “50 Best Inventions.” The rationale behind predictive policing was straightforward: if an algorithm could determine where and when crimes were likely to occur, police departments could deploy their resources more effectively. And in a self-fulfilling way, it worked: saturating a neighborhood with more police resulted in more arrests, and more arrests resulted in more policing.
Algorithmic systems were supposed to be neutral and value-free, but that turned out not to be true. Not surprisingly, those arrests disproportionately affected people of color. Racism was baked into the formula. A report from the Electronic Frontier Foundation also found that some predictive systems were being used to target specific people. In one Florida jurisdiction, “police assigned a point system to individuals based on factors like a family’s income bracket or grades in order to assign a level of likelihood a child is to commit crimes—and then harass them and their family.”
Predictive policing is just one tool municipalities have employed to anticipate and (in theory) prevent future crimes. COMPAS, a “risk assessment” algorithm used in many jurisdictions to determine who is eligible for parole, is based on a slew of variables, including crime rates in the neighborhood to which a potential parolee would be returning. (Predictive policing increases the crime rates in those neighborhoods; COMPAS then denies parole.) Palantir, the secretive data analytics software company, patented its domestic crime forecasting system in 2014, though it was used two years earlier in New Orleans and was also sold to the US government to track down terrorists. ICE is now paying Palantir $30 million to develop ImmigrationOS, a software package that will help the government determine whom to deport. It is expected to have a prototype ready by September. And in May The New York Times reported that the Trump administration also appears to be using Palantir to amass a database on the population at large, which according to the report “could give him untold surveillance power.”
By the time Lalami returned to The Dream Hotel after putting it aside for several years, sleep trackers, license plate readers, doorbell cameras, geofencing, GPS, Bluetooth beacons, and facial recognition had become commonplace. Her protagonist, like most of us, has so thoroughly surrendered her privacy that its absence hardly registers. “Entire generations have never known life without surveillance,” Sara thinks.
Watched from the womb to the grave, they take corporate ownership of their personal data to be a fact of life, as natural as leaves growing on trees. Detaining someone because of their dreams doesn’t exactly trouble Americans; most of them think that the RAA’s methods are necessary.
It is rare for a dystopian novel to be overtaken and then surpassed by the events of the day, but that is where we landed after Trump won the 2024 election. His executive order “Protecting the American People Against Invasion,” issued on his first day in office, instructed the Department of Homeland Security “to ensure the efficient and expedited removal of aliens from the United States.” It has resulted in the detention of countless people, including those such as Öztürk who are in the country legally. Like Sara Hussein, most of these detainees are being held in privately run prisons. (The GEO Group, which oversees the Louisiana facility where Öztürk was held, was a major contributor to Trump’s campaign.) Unlike Sara, most are not well-resourced American citizens—though that may change. According to the investigative reporter Julia Angwin, writing in The New York Times,
President Trump could soon have the tools to satisfy his many grievances by swiftly locating compromising information about his political opponents or anyone who simply annoys him.
Meanwhile the euphemistically named Tools for Humanity, a company backed by OpenAI’s Sam Altman, a member of Trump’s inner circle, will soon be releasing a biometric device called the Orb that scans a person’s iris, creates a personal ID verifying that they are a real human, issues them a cryptocurrency called Worldcoin, and stores the information on the blockchain. The Orb’s questionable privacy implications, and its dependence on cryptocurrency, have already caused it to be banned, restricted, or placed under investigation in Brazil, Spain, South Korea, Indonesia, Kenya, and France. After the Trump administration rescinded Biden-era cryptocurrency regulations, the way was cleared for the Orb to be launched in the US. “Sometimes something that looks dystopian is, in fact, dystopian,” the Georgetown law professor Emily Tucker and the philosopher David McNeill point out in an article on the website Tech Policy Press.
Lalami is a canny, powerful, humane writer. She has created a fictional world that we now understand implicitly and a protagonist who could be any one of us. Sara Hussein assumes she will gain her freedom by following the rules. But her sentence keeps spooling outward; following the rules has not brought her any closer to being released. As Lalami writes,
It’s what she tried to do for months—minding her business, following every rule, trying to appear more compliant, more deserving of freedom than others stuck in this place. But what did that achieve, in the end? She’s still stuck here, still marked as a QUESTIONABLE. And in the meantime she’s been depleted of her dignity.
Eventually it occurs to Sara that the only way to regain her autonomy, even if it is sure to keep her locked up longer, is to flout the rules and persuade other women to do so, too. Self-determination, to whatever degree is possible under the circumstances, is a kind of freedom, if only because it restores one’s agency. It may be the only freedom afforded to people who are incarcerated.
Even so, it’s not exactly clear what Sara is after when she decides to quit the jobs to which she’s been assigned at Madison: working in the laundry and training AI software for a private company contracted with Safe-X by watching artificially generated movie clips and answering the question “IS THIS REAL?” As she tells one of the guards,
There’s no rule that says I have to work. This isn’t a prison, remember? I haven’t been convicted of a crime, and I can’t be compelled to work by any correctional entity, whether public or private.
Yet as Sara learns early on, refusing a job assignment means that the algorithm will mark her as “unemployed” and negatively affect her risk score. In other words, working doesn’t necessarily lower it, but not working raises it.
Emboldened by her own nerve and the revelation that a fellow retainee was actually a researcher from the Dreamsaver company, Sara begins to organize the other retainees. Striking will surely lengthen their sentences, but it also holds the promise of bringing down the whole enterprise. Without their labor, who will serve the food, do the wash, and fulfill Safe-X’s lucrative contracts with outside businesses that rely on cheap grunt work?
If collective action is meant to benefit the common good, then Sara’s effort is an abject failure. While it upsets the RAA, it doesn’t cripple or destroy the system that has incriminated Sara and the others for something they haven’t done. It does, however, serve Sara well personally: with surprising swiftness, as the strike begins to take hold, the authorities release her. She and another retainee, her main partner in this endeavor, get to walk away, leaving behind the women they’ve convinced to follow their lead. Sara is freed from Madison, but she now knows that in a techno-surveillance state, where fear is the engine of social control, no one is safe from its malign reach.
On first reading, this seems too easy—in real life, troublemakers are rarely rewarded, and successful resistance takes a sustained, persistent effort. Lalami clearly needed a way to get Sara home to end the book, and springing her in this way is consistent with the arbitrary nature of her incarceration and the mercurial behavior of her keepers. Upon reflection, though, it is more than this. Dystopian novels are not merely the expression of vivid imaginations; they are often warnings about what’s to come. So far no one is mining our dreams, though who is to say that won’t happen in the future—and for many of us, Trump and his attendants have already infected our sleep. The lesson of The Dream Hotel is not that it—which is to say surveillance and the demolition of civil liberties—can happen here or even that it is happening here. It is that dissent is still an option, if only, like Sara Hussein, we choose to take it.