There’s no way to develop realistic recommendations to improve trust in the media and democracy without beginning with a slightly pedantic and intellectually honest conversation regarding the “pageantry of reality” in American media.
The complexity of reality and its relationship to truth and objectivity makes this a difficult topic to address appropriately because, realistically, there is no way for any of us to escape our individual perception of reality; this includes the practitioners of journalism.
We all possess attitudes, beliefs and values; arguably, these attributes are derived from a combination of our nature and our nurture.
Political beliefs are inherent to being human; our political beliefs are opinions, not truth.
As it pertains to the political, truth is more complex than belief because, in order to understand truth, we need to understand and delineate the values that drive our beliefs about what should, must and ought to be in our politics. Our values are derived from what we believe is important; our values shape meaning.
Without this understanding, we cannot see how our disagreements about “truth” in our politics derive from a lack of shared meaning.
Purposeful deception is a lie; to misdirect meaning or to convey a false impression is to lie. But what if core, underlying meanings and assumptions are not shared and were never agreed upon? How does that complicate “the truth”?
In politics, truth requires deep context because, as Americans, we do not create and share meaning through the same attitudes, beliefs, values and core assumptions about what should, must and ought to be in the solutions to the social and economic challenges we encounter.

Ultimately, in terms of specific recommendations to improve trust in the media and democracy, who gets to decide what is best is an interesting and challenging question to begin with.
Who knows best? Why is this group of Knight commissioners the best group of people to understand and make recommendations to address this incredibly complex problem? From what or whose account is their expertise and authority derived? Do we all agree that there is a problem that needs to be addressed?
If so, what is the problem? What caused it? Who is to blame? What will happen if we do not address the problem?
Why should American media businesses and social media/social networking service companies respond to these recommendations? How do the recommendations impact their profitability? Do the recommendations require governmental regulatory intervention?
Beyond these questions, here are a few additional factors to consider:
1. In politics, risk and blame are perceived in divergent ways; this leads to factions.
2. The best we can do to address the reality of factions is to establish ways to force recognition of the various factional accounts and competing interests in light of one another in media as it relates to democracy.
The irony here is that the Fairness Doctrine (1947–1987) required that factional accounts of reality be presented side by side, but this was later determined to have a “chilling effect” on free speech.
When the balance between fairness and free speech needs to be considered in light of regulation, the topic becomes political.
3. Ultimately, we have to wrestle with the raw fact that law emanates from the sovereign and recommendations may have no effect. Recommendations are easy to ignore.
It is important to acknowledge that there is a vast difference between doing what’s right to ensure a corporation’s profitability and doing what is right for America.
4. In a democracy, the faction that wins gets to decide until the balance of power shifts.
5. Single narratives are dangerous; multiple narratives exist because divergent interests exist.
The acknowledgement of these facts highlights the importance of dialogue, as opposed to monologue.
6. Journalism, in news as it relates to politics, needs to be able to account for why and how humans perceive the world differently; Artificial Intelligence (AI), the science of psychometrics and psychometric analytical tools can help greatly in this endeavor.
7. Journalism has only just begun to understand what AI and the science of psychometrics know about human systems of belief; for this reason, politics, marketing and the profession of public relations are currently gaming the systems of reporting in news as it relates to politics.
For example, the news directly reports what a politician says, and AI-enabled tools can determine the best phrases and talking points for politicians to use to yield emotional responses in their target audiences. When a politician uses those targeted phrases and talking points, they reach their target audience and prompt emotional responses with no intervention from other narratives; in these instances, the single story yields its most powerful effect.
And the ethical question is: Is this innocent “free speech,” or are these AI-enabled monologues psy-ops on the American people? In this context, the question of fairness as an operative component of an active democracy becomes a political question.
8. Therefore, the challenge before the Knight Commission on Trust, Media and Democracy is deeper than simply how to hold institutions accountable for how information is collected and spread online.
American politics and the practices of journalism have been forever altered by AI-enabled tools, and journalism has not adjusted to acknowledge these new methods and practices in politics.
Going forward, journalism needs to operate with a better understanding of how artificial intelligence is being used behind the scenes to impact American political outcomes.
Through data scraping methods, information is collected about what people like, share and post on social media platforms like Facebook. In combination with other information such as gender, age and place of residence, this data can be used to help researchers and speechwriters make correlations between social media data and what voters believe and want to hear.
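To make that mechanism concrete, here is a minimal, hypothetical sketch in Python of the kind of correlation being described. Every column name and number, including the supports_policy label, is invented for illustration; this is not any real platform's data or any campaign's actual pipeline.

```python
# A minimal, hypothetical sketch of the correlation step described above.
# All names and numbers are invented for illustration only.
import pandas as pd

# Toy stand-in for scraped engagement data joined with basic demographics
# and a stated preference (e.g., from a survey panel).
profiles = pd.DataFrame({
    "likes_per_week":  [12, 3, 45, 7, 30, 2, 25, 9],
    "shares_per_week": [4, 0, 15, 1, 9, 0, 7, 2],
    "age":             [24, 57, 31, 66, 29, 71, 38, 45],
    "supports_policy": [1, 0, 1, 0, 1, 0, 1, 0],
})

# Which behaviors and attributes move together with the stated preference?
# This single column of correlations is, in essence, what message-testers
# and speechwriters are after.
print(profiles.corr()["supports_policy"].sort_values())
```

Real systems work at a vastly larger scale and with far richer features, but the underlying move is the same: join behavioral traces to attributes and beliefs, then look for what predicts what.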
AI-powered, big data enabled, psychographic insight tools are being used in marketing, advertising and public relations to understand and shape both consumer and voter behavior.
In this context, Artificial Intelligence is being used in combination with the Big Five personality traits model, also known as the five factor model (FFM), to develop talking points and phrases in persuasive arguments (aka rhetoric) that yield highly emotionally effective, politically oriented speech, writing and advertising. These same persuasive arguments are also being used by both politicians and their operatives in news and social media as it relates to politics to impact elections.
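The step from trait scores to targeted phrasing can be sketched just as simply. The hypothetical Python snippet below shows the basic shape of FFM-driven message framing: given estimated Big Five scores for an audience segment, choose the frame expected to resonate most. The scores and frames are invented for illustration and are not taken from any real tool.

```python
# A hypothetical sketch of FFM-driven message framing: given estimated
# Big Five trait scores for an audience segment, pick the talking-point
# frame predicted to land hardest. Scores and frames are illustrative only.
BIG_FIVE = ["openness", "conscientiousness", "extraversion",
            "agreeableness", "neuroticism"]

# Example frames keyed by dominant trait (purely illustrative).
FRAMES = {
    "openness":          "emphasize change, novelty and possibility",
    "conscientiousness": "emphasize order, duty and proven track records",
    "extraversion":      "emphasize energy, rallies and shared identity",
    "agreeableness":     "emphasize community, family and cooperation",
    "neuroticism":       "emphasize threats, risk and safety",
}

def pick_frame(segment_scores: dict) -> str:
    """Return the frame for the segment's highest-scoring trait."""
    dominant = max(BIG_FIVE, key=lambda t: segment_scores.get(t, 0.0))
    return FRAMES[dominant]

# A segment scoring highest on neuroticism gets threat/safety framing.
print(pick_frame({"openness": 0.3, "conscientiousness": 0.5,
                  "extraversion": 0.2, "agreeableness": 0.4,
                  "neuroticism": 0.8}))
```

The point of the sketch is not sophistication; it is that once an audience has been scored, choosing the emotionally optimal frame is trivial to automate and to repeat at scale.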
Therefore, journalism needs new tools and standards, as it engages with American politics, to understand and address the strategic advantage offered by using artificial intelligence for public relations and persuasive argument (rhetoric) in news as it relates to politics.
This is especially important as it relates to the use of new psychometric, analytical, AI-powered tool sets, such as Cambridge Analytica and Crimson Hexagon, that can be used by political campaigns as well as by active American politicians and their spokespersons to evoke emotional responses from audiences and, ultimately, affect political outcomes through highly targeted political speech that is sourced and derived via AI-powered, big data enabled, psychographic insights.