AI hallucinations in legal practice: causes and technical solutions from PyleHound
In artificial intelligence, the term “hallucination” refers to a model producing output that sounds convincing but is in fact incorrect or fabricated. How does PyleHound ensure, on a technical level, that you can rely on its search results?
More Stories

AI as a lawyer: What is permitted? The guidelines from BRAK & DAV provide clarity
Both publications convey the same core message: AI should be used, but in a well-organized manner.

DAV statement: Use of AI compatible with the legal profession
With its current statement No. 32/2025, the German Bar Association (DAV) sends a clear signal on professional law: the use of artificial intelligence (AI) in law firms is not only possible but also unobjectionable under professional law, provided that technical and contractual safeguards are observed.
