{"id":24407,"date":"2025-12-18T09:52:31","date_gmt":"2025-12-18T12:52:31","guid":{"rendered":"https:\/\/justen.com.br\/artigo_pdf_est_2adv_\/digital-evidence-in-arbitration-in-times-of-disinformation-authenticity-burden-of-proof-and-limits-of-judicial-cooperation-in-the-search-for-truth\/"},"modified":"2026-02-05T16:53:14","modified_gmt":"2026-02-05T19:53:14","slug":"digital-evidence-in-arbitration-in-times-of-disinformation-authenticity-burden-of-proof-and-limits-of-judicial-cooperation-in-the-search-for-truth","status":"publish","type":"artigo_pdf_est_2adv_","link":"https:\/\/justen.com.br\/en\/artigo_pdf_est_2adv_\/digital-evidence-in-arbitration-in-times-of-disinformation-authenticity-burden-of-proof-and-limits-of-judicial-cooperation-in-the-search-for-truth\/","title":{"rendered":"DIGITAL EVIDENCE IN ARBITRATION IN TIMES OF DISINFORMATION: AUTHENTICITY, BURDEN OF PROOF, AND LIMITS OF JUDICIAL COOPERATION IN THE SEARCH FOR TRUTH"},"content":{"rendered":"\n<h4 class=\"wp-block-heading\"><strong><strong>1. Introduction<\/strong><br\/><strong><strong>1.1. Types of Artificial Intelligence <\/strong><\/strong> <\/strong><\/h4>\n\n<div style=\"height:10px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n<p class=\"has-text-align-justify texto-justificado\">As artificial intelligence systems evolve and become integrated into our routines, their potential and utility also become evident in the legal context. Their use as assistants in dispute resolution is manifested, above all, in the speed and efficiency with which these tools can execute and improve tasks such as the management, review, and translation of documents, legal research (Jus Mundi, LexisNexis, CoCounsel, Harvey), and the analysis and presentation of evidence. Many legal professionals have begun to use AI to assess the strength of evidence and estimate the probability of success in lawsuits, freeing up lawyers&#8217; time for more strategic work. 
<\/p>\n\n<div style=\"height:40px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n<h4 class=\"wp-block-heading\"><strong><strong><strong><strong>1.2. AI in Arbitration<\/strong><\/strong><\/strong><\/strong><\/h4>\n\n<div style=\"height:10px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n<p class=\"has-text-align-justify texto-justificado\">Although their use is predominantly individual and external to the proceedings, AI tools raise relevant concerns regarding the confidentiality and security of information. Arbitration involves sensitive data, documents, procedural strategies, and the identities of parties and third parties, all of which can be inadvertently exposed when entered into platforms operated by third parties or into models that run on external servers. <\/p>\n\n<div style=\"height:10px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n<p class=\"has-text-align-justify texto-justificado\">Depending on the architecture of the tool and the volume of data shared, the use of systems such as generative models may even require the parties&#8217; prior consent, since certain information may leave the exclusive domain of the user and become part of external databases. As there is still no clear regulation on the limits of this sharing, zones of uncertainty persist that can generate procedural incidents, disputes over the validity of acts, and questions about the integrity of the procedure. <\/p>\n\n<div style=\"height:10px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n<p class=\"has-text-align-justify texto-justificado\">In contrast, technologies used in the process, that is, formally incorporated into the procedural rules and executed as part of the proceedings, present challenges of a different nature. 
By assuming functions typically performed by participants in the proceedings, such as conducting hearings or the technical analysis of documents, such tools can affect structural principles of arbitration, especially due process, equality, impartiality, and freedom of conviction. <\/p>\n\n<div style=\"height:10px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n<p class=\"has-text-align-justify texto-justificado\">Although they can streamline specific steps, they do not alter the central logic of the procedure, which continues to depend on human choices, methodological validations, and the procedural dialogue between the parties and the tribunal. In practice, more sophisticated systems tend to introduce new layers of debate (on algorithmic opacity, biases, the admissibility and probative force of automated results, and the security of the data processed), which can increase, rather than reduce, the duration and costs of arbitration. <\/p>\n\n<div style=\"height:10px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n<p class=\"has-text-align-justify texto-justificado\">On the other hand, already consolidated solutions for digitization and virtualization (such as electronic filing, case management platforms, and remote hearings) have demonstrated measurable reductions in time and costs. The same does not occur with advanced AI tools: although promising, their use remains restricted, their economics variable, and their outputs dependent on strong human supervision. 
<\/p>\n\n<div style=\"height:10px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n<p class=\"has-text-align-justify texto-justificado\">Thus, true efficiency in arbitration continues to depend on a realistic assessment of the cost-benefit ratio of available technologies and on the maintenance of human control over acts that influence the conviction of the adjudicators (Scherer, 2019).<\/p>\n\n<div style=\"height:40px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n<h4 class=\"wp-block-heading\"><strong><strong><strong><strong><strong>2. Impacts of AI on the production of evidence in Arbitration<\/strong><br\/><strong><strong>2.1. On the need for regulation<\/strong><\/strong><\/strong><\/strong><\/strong><\/strong><\/h4>\n\n<div style=\"height:10px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n<p class=\"has-text-align-justify texto-justificado\">Although artificial intelligence has proven to be a powerful tool for the production and management of evidence in arbitration, the impacts its use may have in specific cases cannot be ignored. The construction of specific normative and guiding parameters is therefore essential, in order to prevent the indiscriminate adoption of these technologies from compromising the authenticity of the evidence, due process, and, ultimately, confidence in the arbitral procedure itself. <\/p>\n\n<div style=\"height:10px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n<p class=\"has-text-align-justify\">This risk is particularly sensitive in the case of so-called deepfakes, capable of creating false evidence or manipulating existing evidence and making it extremely difficult to distinguish what is authentic from what was generated or adulterated by AI. 
Reliance on falsified evidence can distort the outcome of arbitrations and, in extreme situations, lead to genuine miscarriages of justice, in addition to imposing additional time and costs when the tribunal needs to resort to experts to verify the legitimacy of digital evidence. <\/p>\n\n<div style=\"height:10px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n<p class=\"has-text-align-justify\">For this reason, several arbitral institutions and other organizations have recently published normative texts, such as the CIArb Guideline on the Use of AI in Arbitration, according to which a Party intending to use a Permitted AI Tool shall use reasonable efforts to independently verify the sources and accuracy of the results obtained, correcting any errors before submitting documents or other evidence and, where necessary, throughout the procedure (CIArb, 2024). Conversely, the use of AI to produce content capable of misleading the tribunal or the opposing party is expressly prohibited, whether through the fabrication or adulteration of evidence or through commands (\u201cprompts\u201d) designed to deliberately generate an untrue or biased result (CIArb, 2024). 
<\/p>\n\n<div style=\"height:10px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n<p class=\"has-text-align-justify\">From a similar perspective, the SVAMC Guidelines on the Use of Artificial Intelligence in Arbitration establish, as non-binding guidance, in clause 5, that the parties must respect the integrity of the proceedings and that the use of AI to falsify evidence is not permitted, so that neither the authenticity of the evidence nor the arbitral procedure is compromised (SVAMC, 2024).<\/p>\n\n<div style=\"height:10px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n<p class=\"has-text-align-justify\">Similarly, the VIAC Note on the Use of Artificial Intelligence in Arbitration Proceedings acknowledges that it is up to the arbitrators, in the exercise of their discretion, to decide whether to require the disclosure of evidence produced with the support of AI, as well as to determine the admissibility, relevance, materiality, and probative value of such elements (VIAC, 2024).<\/p>\n\n<div style=\"height:10px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n<p class=\"has-text-align-justify\">The SCC Guide to the Use of AI in Cases Administered under the SCC Rules indicates that these tools should be equipped with reliable mechanisms to flag and detect content that has been generated or manipulated by AI (SCC, 2024). In the same vein, Leonardo F. Souza-McMurtrie highlights that artificial intelligence itself is one of the most promising solutions to the problem of evidence falsification, insofar as it can be trained to identify it (Souza-McMurtrie, 2023). <\/p>\n\n<div style=\"height:10px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n<p class=\"has-text-align-justify\">Beyond the use of AI by the parties, the need to regulate its use by the arbitrators themselves is also discussed, with a view to establishing clear parameters for their conduct. 
In general, the conditions imposed for the use of AI emphasize a combination of transparency and the personal responsibility of the human operator, who remains in charge of supervising the result generated by the tool and answerable for its content. <\/p>\n\n<div style=\"height:10px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n<p class=\"has-text-align-justify\">In this context, perhaps the most sensitive rules are those related to the treatment of confidential information or trade secrets. Some courts have taken special care to prevent the mere filing of pleadings and documents from resulting in the disclosure of sensitive data, especially when widely available programs such as ChatGPT are used. For example, it is required that briefs containing confidential or proprietary information expressly identify those excerpts (for example, in parentheses or highlighted), that a non-confidential version of the document with the sensitive information redacted be submitted, and that the recipients of the confidential briefs refrain from disclosing their content to anyone not authorized to receive it. Even though the companies responsible for these systems claim that there is no risk of undue access, it is feared that they may retain sensitive data entered by users and that, under certain circumstances, this information may become accessible to unauthorized third parties, in violation of the duty of confidentiality that commonly governs arbitration. 
<\/p>\n\n<div style=\"height:10px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n<p class=\"has-text-align-justify\">In line with the CIArb Guideline on the Use of AI in Arbitration (2024, item 4.5), however, these precautions should not prevent the private use of AI tools by the parties and their teams in internal activities that do not interfere with the progress or integrity of the procedure, since arbitrators should not regulate this type of use.<\/p>\n\n<div style=\"height:10px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n<p class=\"has-text-align-justify\">It follows that, with proper regulation, the use of artificial intelligence in arbitral procedures can be an important instrument for the improvement and evolution of arbitration, provided that guidelines preserving the good faith and credibility of the procedure are observed. Notwithstanding these regulatory advances, gaps remain in each of these instruments, and such normative frameworks still have much to evolve. As Leonardo F. Souza-McMurtrie argues, the current guidelines are still premature and may disorganize a process under construction, thereby hindering the emergence of better solutions; they must therefore be continuously revisited in light of practical experience (Souza-McMurtrie, 2025). <\/p>\n\n<div style=\"height:40px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n<h4 class=\"wp-block-heading\"><strong><strong><strong><strong>2.2. Authenticity of evidence<\/strong><\/strong><\/strong><\/strong><\/h4>\n\n<div style=\"height:10px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n<p class=\"has-text-align-justify texto-justificado\">Beyond the need for regulation, a second axis of concern arises: the authenticity of the digital evidence produced or processed by these tools. 
In a context in which the decision of the arbitral tribunal depends, to a large extent, on confidence in the evidence presented, any uncertainty regarding the integrity of that material directly affects the legitimacy of the result. <\/p>\n\n<div style=\"height:10px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n<p class=\"has-text-align-justify\">In this scenario, the creation of false evidence emerges as one of the main threats associated with the use of AI in arbitration, especially in the production of evidence intended for the arbitral tribunal: the same technology that allows organizing, searching, and analyzing large volumes of data can also be used to fabricate evidence or adulterate authentic content, as in <em>deepfakes<\/em>.<\/p>\n\n<div style=\"height:10px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n<p class=\"has-text-align-justify\">On the other hand, the very distrust of certain types of evidence opens space for what Rebecca Delfino called the \u201c<em>deepfake<\/em> defense,\u201d a strategy by which lawyers use skepticism about the integrity of digital evidence to cast doubt on its legitimacy, even when it is, in fact, authentic (Delfino, 2023). This creates an environment in which the arbitrators&#8217; apprehension about the possibility of technological manipulation is exploited to challenge the credibility of practically any evidence, at any time. 
<\/p>\n\n<div style=\"height:10px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n<p class=\"has-text-align-justify\">Guidelines such as the CIArb Guideline on the Use of AI in Arbitration, although they already recognize the risks associated with AI in the production of evidence, do not directly address specific strategies such as the so-called \u201cdeepfake defense,\u201d which sits at the intersection of technology, law, and ethics; this gap aggravates the difficulty tribunals face in dealing with this new type of allegation (CIArb, 2024).<\/p>\n\n<div style=\"height:10px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n<p class=\"has-text-align-justify\">For these reasons, it has been suggested (Limond; Calthrop, 2025) that digital evidence be accompanied by a declaration of authenticity signed by counsel, or by a specialized technical opinion attesting that it has been examined and found authentic and reliable, without removing the parties&#8217; responsibility, in light of the principle of good faith, to ensure the authenticity and legality of the evidence and arbitration documents they present.<\/p>\n\n<div style=\"height:40px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n<h4 class=\"wp-block-heading\"><strong><strong><strong><strong>2.3. Burden of proof<\/strong><\/strong><\/strong><\/strong><\/h4>\n\n<div style=\"height:10px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n<p class=\"has-text-align-justify texto-justificado\">The use of artificial intelligence in the production, organization, and analysis of evidence significantly modifies the distribution of the burden of proof in arbitration. Digital evidence mediated by algorithms ceases to be a simple document presented to the tribunal and comes to involve technical processes that require human supervision, methodological transparency, and, when necessary, explanations about its origin, traceability, and integrity. 
<\/p>\n\n<div style=\"height:10px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n<p class=\"has-text-align-justify\">This additional burden arises from factors such as algorithmic opacity (characteristic of models whose internal functioning cannot be verified), the potential for automatic reconstruction of content by generative tools, and the technical inequality between the parties, which can compromise the exercise of due process. In this environment, the notion of continuous human responsibility applies: although AI participates in the evidence-production chain, the responsibility for verifying, reviewing, and validating the content remains entirely human. <\/p>\n\n<div style=\"height:10px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n<p class=\"has-text-align-justify texto-justificado\">The most recent guidelines from arbitral institutions reinforce this understanding. The CIArb Guideline on the Use of AI in Arbitration requires parties to indicate when they use AI to produce or process evidence and to independently verify the accuracy of the results (CIArb, 2024). The SVAMC Guidelines reiterate that technology cannot compromise the authenticity of the evidence (SVAMC, 2024). The VIAC Note recognizes the arbitrators&#8217; power to request supplementary information about the methods employed (VIAC, 2024), while the SCC Guide recommends the use of tools capable of detecting artificially generated or manipulated content (SCC, 2024). Together, these guidelines establish that the party introducing technology into the procedure is responsible for demonstrating that it has not compromised the integrity of the evidence. <\/p>\n\n<div style=\"height:10px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n<p class=\"has-text-align-justify texto-justificado\">AI also gives rise to situations in which the burden of proof shifts. 
This occurs when the opposing party raises a plausible doubt about the authenticity of the material, especially in a context marked by deepfakes and sophisticated digital manipulation tools. In such situations, it falls to the party that presented the evidence to explain how it was obtained and to demonstrate the digital chain of custody. <\/p>\n\n<div style=\"height:10px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n<p class=\"has-text-align-justify texto-justificado\">This set of factors reinforces the importance of preserving the digital chain of custody. Evidence produced or processed by AI should be accompanied by information about the tool used, the parameters applied, the model versions, and the stages at which it was modified. <\/p>\n\n<div style=\"height:10px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n<p class=\"has-text-align-justify texto-justificado\">In summary, artificial intelligence does not diminish the burden of proof in arbitration, but transforms it. The party that makes use of AI assumes an expanded responsibility to demonstrate the authenticity, integrity, and reliability of the evidence presented, and must ensure adequate human supervision, methodological transparency, and complete traceability of the material. In a scenario characterized by sophisticated possibilities of digital manipulation, risks of algorithmic error, and relevant technical asymmetries, the burden of proof follows the technological risk introduced into the procedure, reaffirming that technology expands, and does not replace, the evidentiary responsibility of the parties. <\/p>\n\n<div style=\"height:40px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n<h4 class=\"wp-block-heading\"><strong><strong><strong><strong><strong>2.4. 
Limits of judicial cooperation in the search for truth<\/strong><\/strong><\/strong><\/strong><\/strong><\/h4>\n\n<div style=\"height:10px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n<p class=\"has-text-align-justify texto-justificado\">The complexity of digital evidence and the risk of manipulation by artificial intelligence can lead the parties, in exceptional situations, to seek the support of the Judiciary to preserve or enable the production of evidence in aid of arbitration \u2013 for example, through urgent injunctions, the early production of evidence, or acts that depend on state coercive powers, especially in relation to third parties not subject to the jurisdiction of the arbitral tribunal (Lew; Mistelis; Kr\u00f6ll, 2003).<\/p>\n\n<div style=\"height:10px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n<p class=\"has-text-align-justify\">However, this cooperation has well-defined limits, as arbitration remains an autonomous procedure, founded on competence-competence and on the arbitrators&#8217; responsibility for conducting the taking of evidence. State intervention cannot replace the evidentiary assessment of the arbitral tribunal or serve as an indirect route to relitigate issues of merit (Lew; Mistelis; Kr\u00f6ll, 2003). <\/p>\n\n<div style=\"height:10px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n<p class=\"has-text-align-justify\">As classical doctrine emphasizes, judicial intervention in arbitration should remain restricted to support measures intended to supply the coercive powers that arbitrators lack, such as ordering the preservation of evidence or the delivery of documents held by third parties. In these situations, state courts act only to make the arbitral process effective, without replacing the evidentiary judgment of the arbitral tribunal or interfering in the conduct of the merits (Lew; Mistelis; Kr\u00f6ll, 2003). 
<\/p>\n\n<div style=\"height:10px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n<p class=\"has-text-align-justify\">Another relevant limit is confidentiality. The submission of sensitive documents, technical metadata, or strategic business information to a judicial process, which is generally public, can compromise duties of confidentiality assumed by the parties in arbitration. Recourse to the Judiciary should therefore be exceptional and, when inevitable, accompanied by measures that reduce the risk of undue exposure. <\/p>\n\n<div style=\"height:10px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n<p class=\"has-text-align-justify\">It is generally not accepted that judicial cooperation be used strategically to transfer to the Judiciary the task of verifying the authenticity of digital evidence or endorsing generic allegations of manipulation by artificial intelligence. In these situations, the discussion must be submitted to the arbitral tribunal itself, which is responsible for conducting the taking of evidence and assessing its results. Only in extreme cases, in which it is argued that the award was rendered on the basis of false evidence, is there room for judicial review of the award, within the narrow limits of art. 32 of the Arbitration Law, as illustrated by recent decisions involving arbitrations with the participation of the Public Administration (AGU, 2023). <\/p>\n\n<div style=\"height:10px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n<p class=\"has-text-align-justify\">The legitimate space for judicial cooperation is therefore restricted to cases in which the effectiveness of the arbitral procedure depends on acts that require external coercive powers, such as orders directed at third parties, the urgent preservation of volatile data, or preventive measures to avoid the destruction of evidence. 
Even in these cases, state action must be instrumental, always subordinate to the decisions of the arbitral tribunal and without invading its decision-making sphere. <\/p>\n\n<div style=\"height:40px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n<h4 class=\"wp-block-heading\"><strong><strong><strong><strong><strong>3. Conclusion<\/strong><\/strong><\/strong><\/strong><\/strong><\/h4>\n\n<div style=\"height:10px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n<p class=\"has-text-align-justify texto-justificado\">Many argue that regulation is necessary precisely to preserve the human character of the dispute resolution process, without, however, stifling innovation. The main arbitral institutions have already been establishing important rules for the use of artificial intelligence, but regulation must be introduced in adequate, progressive doses, as has occurred with other technologies throughout history (CIArb, 2024; SVAMC, 2024; VIAC, 2024; SCC, 2024). <\/p>\n\n<div style=\"height:10px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n<p class=\"has-text-align-justify texto-justificado\">The technological race among large companies is irreversible, and those concerned with due process in arbitration need to recognize both the benefits and the risks of integrating artificial intelligence. In the case law of several jurisdictions, including that of the Swiss Federal Tribunal and the English High Court, the analogy with the tribunal secretary offers a promising path: although useful, the secretary remains an assistant, limited by the arbitrator&#8217;s directions and by the fundamental principles of arbitration (Swiss Federal Tribunal, Case 4A_709\/2014, 2015; P v Q [2017] EWHC 194 (Comm)). 
Similarly, artificial intelligence can and should be understood as a powerful tool, capable of streamlining procedures, offering insights, and even assisting in the drafting of decisions, but it must always be kept in the position of technical assistant, subordinate to the final discretion and critical judgment of the human arbitrator (Lew; Mistelis; Kr\u00f6ll, 2003; Scherer, 2024). <\/p>\n\n<div style=\"height:10px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n<p class=\"has-text-align-justify texto-justificado\">Ultimately, the credibility of arbitration in an environment marked by disinformation and the increasing sophistication of digital manipulation will depend less on the speed of technology and more on the ability of the parties and arbitrators to control it, audit it, and use it without abdicating evidentiary rigor. Artificial intelligence can strengthen the arbitral procedure, provided that it is incorporated in a responsible, transparent, and proportional manner (Souza-McMurtrie, 2023; Delfino, 2023). 
<\/p>\n","protected":false},"author":5,"featured_media":0,"template":"","categories":[534],"tags":[535],"ppma_author":[434],"class_list":["post-24407","artigo_pdf_est_2adv_","type-artigo_pdf_est_2adv_","status-publish","hentry","category-edition-226-december-2025","tag-artigopdf","entry"],"acf":[],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v27.5 - https:\/\/yoast.com\/product\/yoast-seo-wordpress\/ -->\n<title>DIGITAL EVIDENCE IN ARBITRATION IN TIMES OF DISINFORMATION: AUTHENTICITY, BURDEN OF PROOF, AND LIMITS OF JUDICIAL COOPERATION IN THE SEARCH FOR TRUTH &#8212; Advogados<\/title>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/justen.com.br\/en\/artigo_pdf_est_2adv_\/digital-evidence-in-arbitration-in-times-of-disinformation-authenticity-burden-of-proof-and-limits-of-judicial-cooperation-in-the-search-for-truth\/\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"DIGITAL EVIDENCE IN ARBITRATION IN TIMES OF DISINFORMATION: AUTHENTICITY, BURDEN OF PROOF, AND LIMITS OF JUDICIAL COOPERATION IN THE SEARCH FOR TRUTH &#8212; Advogados\" \/>\n<meta property=\"og:description\" content=\"1. Introduction1.1. Types of Artificial Intelligence As artificial intelligence systems evolve and become integrated into our routines, their potential and utility also become evident in the legal context. 