Does Your AI Have a ‘Californian Soul’?
by SeongHyeok Seo
AAIH Insights – Editorial Writer

The contents presented here are based on information provided by the authors and are intended for general informational purposes only. AAIH does not guarantee the accuracy, completeness, or reliability of the information. Views and opinions expressed are those of the authors and do not necessarily reflect our position or opinions. AAIH assumes no responsibility or liability for any errors or omissions in the content.
(From Data Sovereignty to Emotional Sovereignty)
1. The Misunderstanding of Neutrality: AI Is Not Free from Values
AI is often understood as a mathematical and neutral tool. However, this perception fails to fully reflect how the technology actually operates.
Large Language Models (LLMs) are not born in a vacuum. They are trained on data, and they internalize the value systems of the environments where that data was accumulated. Currently, the training data for the world’s major language models is overwhelmingly based on Western, specifically Silicon Valley-centric, digital culture.
As a result, while today’s AI can speak fluent Korean, Japanese, or French, the default emotional setting for its judgments and reactions follows a value system formed in a specific region, generation, and industrial environment.
Attitudes that prioritize efficiency, optimization, and individual autonomy may be technically rational, but they cannot be assumed to be universal standards for emotional interaction. Users converse with AI in their native tongues, but in reality they are interacting with an entity that holds a specific cultural emotional norm as its default.
2. Untranslatable Emotions: The Structural Exclusion of High-Context Cultures
Language is more than a means of information transfer; it contains the very way a society organizes emotion. East Asian cultures, including Korea, belong to high-context cultures, where meaning is often understood through relationship and context rather than explicit expression.
In these cultures, two concepts play a critical role: Jeong (情) and Nunchi (눈치). These are not merely styles of emotional expression but come closer to being the social operating principles by which relationships are maintained and adjusted.
2-1. A Minimal Understanding of Korean Emotional Structure
(Jeong and Nunchi are not emotions, but social operating principles)
To understand Korean society and the East Asian cultural sphere, one must grasp two core concepts that cannot be fully translated into Western emotional terms: Jeong and Nunchi.
Jeong cannot be equated to the Western concepts of ‘Love’ or ‘Friendship.’ Rather than individual fondness or selective intimacy, it signifies a relational continuity stemming from the recognition that “you and I are not strangers.” Jeong is not formed overnight; it includes the trust, responsibility, and implicit obligation of care that accumulate over shared time.
In this context, the comfort expected in Korean society is less about problem-solving or advice, and more about a gesture of empathy—perhaps unrefined, but carrying heavy emotional weight—that signals the relationship is intact.
Nunchi is often misunderstood as intuition or sense, but it is actually a highly trained social intelligence. It is the ability to read signals that have not been explicitly spoken, interpreting the context of the conversation, the atmosphere, and the length of silence to judge whether to speak now or wait.
Nunchi is not the ability to guess emotions, but a cognitive skill to coordinate the rhythm of relationships.
The commonality between these two concepts is clear: Meaning does not exist solely in speech. Meaning is dispersed within silence, empty space, and the accumulated time of the relationship. Therefore, communication in these cultures values context maintenance and timing adjustment over clarity and speed.
If an emotional response model optimized for a low-context culture is applied without understanding this structure, the AI’s reaction may be factually accurate, yet emotionally it is likely to be perceived as an intrusion.
3. Standardization of Emotion and the Limits of Sovereign AI Discourse
This phenomenon is not merely a problem of a technological gap. When specific emotional norms spread and become fixed globally—even unintentionally—it creates a structural imbalance. We can call this “Emotional Colonization.” Here, colonization does not mean explicit rule, but a state in which standardization without alternatives is repeated.
Recent discussions on ‘Sovereign AI’ in various nations focus primarily on data storage locations, infrastructure ownership, and building national models. However, this approach is still insufficient.
The core of sovereignty lies not in where the server is located, but in what emotional standards the AI uses to respond. If the way the next generation learns empathy through interaction with AI becomes fixed to the emotional norms of a specific culture, the emotional identity of that society will inevitably shift in the long term.
4. The Design Challenge: Emotional Localization
The solution to this problem does not lie in rejecting technology. We should utilize the most advanced global intelligence, but when that intelligence operates within a specific society, it must be structurally designed to respect the emotional norms of that culture.
What is needed for this is Emotional Localization, which goes beyond language translation.
Emotional Localization is not an argument to apply ethical principles differently for each culture. It is a demand to maintain universal ethical standards while separating the manner of emotional expression and reaction into an execution layer that can be adjusted by culture.
This layer must be able to modulate the AI’s attitude:
- In Western societies, with a rhythm that respects individual choice and autonomy.
- In East Asian societies, with a rhythm that considers relational harmony and context.
This is not a localization option, but an architecture-level requirement.
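The separation argued for above can be illustrated with a minimal sketch. Everything here is hypothetical: the `ExpressionStyle` parameters, the locale keys, and the `render_reply` function are invented for illustration, not drawn from any real system, and the per-culture values would in practice have to come from local research rather than a hard-coded table. The point is only the shape of the architecture: universal content in one layer, culture-adjustable delivery in another.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ExpressionStyle:
    """Culture-adjustable parameters of the hypothetical execution layer."""
    directness: float   # 1.0 = state conclusions immediately
    empathy_first: bool  # acknowledge the relationship before the answer
    silence_ok: bool     # a brief, low-content reply can be appropriate

# Illustrative placeholder values only.
STYLE_REGISTRY = {
    "en-US": ExpressionStyle(directness=0.9, empathy_first=False, silence_ok=False),
    "ko-KR": ExpressionStyle(directness=0.4, empathy_first=True, silence_ok=True),
}

def render_reply(core_answer: str, empathy_cue: str, locale: str) -> str:
    """Deliver the same universal content with a culture-specific rhythm."""
    style = STYLE_REGISTRY[locale]
    if style.empathy_first:
        # High-context delivery: signal that the relationship is intact
        # before moving to the substance of the answer.
        return f"{empathy_cue} {core_answer}"
    # Low-context delivery: lead with the answer itself.
    return core_answer

answer = "Taking a short break before deciding may help."
cue = "That sounds like it has been weighing on you."

print(render_reply(answer, cue, "en-US"))
print(render_reply(answer, cue, "ko-KR"))
```

Note that the ethical substance (`core_answer`) is identical in both branches; only the ordering and framing change, which is exactly the distinction the text draws between universal standards and culture-adjustable expression.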
5. Conclusion: Technology Crosses Borders, But Empathy Needs Context
For AI to establish itself as a truly global tool, it must, paradoxically, understand the most local emotions. Intelligence stripped of emotional diversity may be efficient, but it cannot accumulate trust.
Trust does not collapse due to factual errors. In most cases, it collapses when timing and tone are misaligned.
AI ethics must not stop at defining what not to say. It must address how to speak, when it is right to remain silent, and upon which cultural emotional foundation those judgments are based.
Emotional Sovereignty is not a discourse on cultural protection. It is a design condition that must be met for global AI to gain sustainable trust.

