“Memory is life. It is subject to the dialectics of remembering and forgetting, unaware of its successive deformations, open to all kinds of use and manipulation. Sometimes it remains latent for long periods, then suddenly revives. Memory always belongs to our time and forms a lived bond with the eternal present.”
- Pierre Nora (Historian)
Prompted remembrance in the age of outsourced memory
A few days ago, Facebook sent me a notification. It reminded me of an old incident: a photo shoot from about a decade ago.
Ten years! I'm less puzzled by the swift passing of time, and far more surprised by the frivolity of remembrance in the digital age. The digital domain has successfully outsourced memory from a purely psychic space to an utterly virtual one—in a span of less than two decades. Memory has become obsolete, as there is no need to remember anything because everything can be digitally (re)stored. A social media prompt of a past event further attests to the abject surrender of internal human memory to external digital algorithms. Remembering is no longer a human act(ivity), but a manifestation of machine memory—channelled through touch-operated screens and invisible networks.
On our rapidly growing, dense social media timelines, we continually post fragmented memories (read: updates) to make our private selves public. But how often, or how rarely, do we revisit earlier updates or the past? It is as if the operating principle of the medium perennially promotes living in the present, where the current post pushes the past into oblivion. Remembering has become so easy, so effortless, so inexpensive, and so heavily outsourced that one does not need to remember anything at all. Drives, cloud spaces, massive servers, and other such digital platforms perform the task of remembrance. From news to discourse; from data to statistics; from images to videos; from entertainment to infotainment—everything can be stored up there and recalled whenever necessary from any part of the world, provided there is internet connectivity.
Easy to remember; difficult to forget—the paradox of digital memory
Technology-mediated transformations have deeper social impacts. For centuries, the task of remembering was challenging and difficult, while forgetting was obvious and easy. Now that equation has been altered: the relationship between remembrance and forgetfulness has been inverted. With the advent of outsourced digital memory, remembrance happens by default, and forgetting has become nearly impossible. Even if you forget, the digital memory remembers, unless you have consciously deleted a selected part or the whole. Its exhaustive (infra)structure does not allow you to forget, even when you have happily forgotten. All the false promises of acche din, the five-trillion-dollar economy, and the return of black money can be replayed on YouTube or wherever they were uploaded. All the bright spots and bitterness on your timeline, in images and words, can be scrolled back to. AI can prompt a bygone moment arbitrarily and take you on a trip of sudden flashback. Bygones are not bygones any longer unless you consciously press ‘delete’. And even then, it may be impossible to delete it from the devices of all those others with whom it was shared, once upon a time.
And such is the paradox: we do not need to remember anything; yet forgetting, too, has met its demise. Outsourcing memory encourages forgetting, while it has become impossible to forget, precisely because memory has been dislocated from the brain to the virtual. In this mammoth and ever-expanding capacity to forget nothing (once stored), the humane ability to organically forget has been challenged. Or rather, we have lost our organic connection with the past, as we try to remember or reconstruct it digitally. Remembering was never as mechanical as scrolling backward to stop objectively at an exact date and time to figure out what happened. To remember was to make an effort, to negotiate through heaps of ever-changing memories, and to recollect a moment or its lived experience. That subjective lost-and-found experience could activate the falsification, fictionalization, and romanticization of the past. What used to be a tough excavation is now just an easy scroll-back, or an abrupt prompt.
As opposed to a mechanized digital prompt, consider an image that was never uploaded on social media. It would still follow the pathway of easy access—a simple click to open an image folder—rather than the difficult attempt to recount and remember. Anything that is digitally captured, edited, and stored invariably has ease of access and ease of recall built into it. It then acts as an instant memory aid. An analogue image may behave in a similar fashion, but it used to be rare, and that rarity concentrated more memory around the making of the image and the anecdotes surrounding it. The digital, on the contrary, is premised not on scarcity but on excess. Excess ensures a paucity of memory per image, or around each image.
Earlier, you could hover around an area of faint remembrance and reinvent it. Now, you can synthetically return to an image, text, sound, or video from a past you may have totally forgotten. In that digitized past, everything is solidified. Nothing transforms under the impact and influence of piled-up memory deposits. The accurate and the objective defy the inherent characteristics of human memory—i.e., to decay, fade, blur, and vanish. ‘Finding’ has never been so easy in the history of humankind. Ctrl+F and other search options have made a mockery of finding while delivering convenience. What digital memory has achieved in the span of the last two decades was unachieved by the analogue combine of print, paintings, and photography put together over thousands of years. Digital memory is also largely unaffected by forces of nature such as water, fire, air, or an earthquake. Therefore, the past is now present-proof and future-proof—as memory is increasingly externalized.
Digital remembrance altering the contours of forgetfulness
How do we make sense of this journey, from forgetting over time to remembering everything that has been digitized? Has the architecture of digital remembrance altered the contours of forgetfulness forever? When we expire, will our digital footprints outlive us? In the infinite storehouse of information, will there be no loss of detail, no gradual erosion? Is there room for fading, blurring, and rusting? Has human(e) forgetfulness become obsolete? Is ‘time’ at all a decisive factor in the process of forgetting, now that the digital has made the objective recall of any digitally archived moment so easy? Is biological forgetting all but redundant by now?
From time-space navigation to information recall; from demographic data to technical know-how; from recipes to therapy; from governments to corporates; from books to movies; from cab booking to bathroom cleaning—everything seems to have fallen under the scheme of systematic digital archiving and access. That easy access, which is also an obvious excess, seems to ridicule the romance of remembrance. The annihilation of the analogue and the subversion of human memory seem complete with the mass dissemination of machine memory.
Without romanticizing that loss or lack, it is time to arrive at a poignant point: what about the individual’s desire to forget? Even though, at present, only a minuscule part of our professional and personal lives, daily activities, and recurring ideas is digitally archived, the day is possibly not far off when second-by-second capturing of everyday life will be possible, or even mandatory, with increasing digital affordance. And with a small touch, scroll, or click, it will be possible to retrieve, relay, recall, and revise an entire life—selectively or entirely! The recollection of personal history will soon become entirely digitized—mediated by outsourced memory.
Even though the combined forces of social media, computer memory, CCTV footage, and cloud spaces currently document only a small segment of our lived experiences and uttered opinions, the day is not far off when digital abstinence will no longer be an option. The spree to publicise anything and everything ‘private’ demands more and more exposure, only to be used by e-commerce vendors, search engines, and social media platforms for commercial gain. In this thick information network, we netizens willingly supply the information, cancelling out any chance of becoming a digital recluse. On one hand this is compulsive and capitalistic; on the other, it is a calling.
Personal panopticons, self-disclosures and claim to memory
The costly panoptical model has been successfully replaced by voluntary self-servitude, in which netizens carry personal panopticons (read: smartphones) attached to their bodies. That makes surveillance more private, more predictable, and more intimate. Memory is not extracted; it is voluntarily supplied by willing agents. At every gathering, at every dinner, at every insignificant drop-of-a-hat moment, we feel compelled to announce our current status. It feeds an enhanced appetite for digital likes, comments, and validation. We are not just passive recipients but manic generators of content about ourselves—no matter how mundane, monotonous, and repetitive it is. That is our claim to memory. We have started owning the past not through subjective remembrance but through objective image-making. Even though it is meant to be forgotten too soon, just like any other useless update.
Or take food: for a certain class, on certain occasions, it has ceased to be a source of nutrition or taste. Like other forms of consumption, it has to be served as a multi-layered visual experience that is Insta-worthy, no matter how bad it tastes. Its look and feel have to be seductive enough for instantaneous, compulsive sharing before digging in. It must reach out for likes and comments before reaching the stomach. It is a liberalized hunger for validation, an all-consuming appetite for colour-coordinated frills around a simple fish. Its shape, texture, and presence on the plate await to be dissolved and digested—not by enzymes, but by disclosures. We are now eating for Insta.
Sociologist Zygmunt Bauman brilliantly framed this postmodern condition: “The fear of disclosure has been stifled by the joy of being noticed.” It is indeed a journey from self-censorship to self-exposure. Obtaining, controlling, and capitalizing on voluntarily given information—also known as data mining—is by now an established global industry. In this scheme of things, all users are potential customers. Each search, click, and preference becomes a clue for more prompts. Each browsing history adds to the mammoth database that cues yet another product recommendation. In this unequal scheme of things, there is no reciprocal transparency, as the user has no access to the exploitative algorithm and its internal mechanisms. The flow of information is inherently unequal, if not absurdly tilted in favour of the data-extractors. And in that uneven flow, everyone is a consumer waiting to be consumed, waiting to be (consum)mated and cramped by the logic of remembrance that wants you, eventually, to forget everything. The same logic of non-remembrance has snatched memory from you. Memory has been strategically dislocated.
While we flirt with our self-image, we are likely to witness, with the assistance of AI, a total reversal of our relationship with remembrance and forgetfulness. Its basic operative principle is likely to remain the same: forget everything; remember nothing in your brain. Conversely, nothing will be forgotten, and everything will be (re)stored and remembered digitally. As brain cells empty, the racks of outsourced digital memory fill up. It is a great era for those who have cribbed about weak memory. As terabytes and gigabytes take over, one really wonders what human beings will do with the recently vacated memory space in the brain!
It is indeed a curious moment in a time fraught with the anxiety of an AI takeover. For all you know, the mediation of touch will soon be obsolete. Voice instruction has already made an entry. In the next leap, perhaps, you will merely think, and the gadget will read your mind and act on your behalf without the mediation of touch, gesture, or any other physical command! After all, the ultimate triumph of the human brain is to invent machines that perform inhuman acts effortlessly.