Pragmatism always wins the day, and with that said, this idea has plenty of merit. Our only concern with Kafka as the "source of truth" is that it is probably not the place to store business-critical data for many years. We think Kafka has its place as:
So, building off of what you probably already know and the second bullet, having a true database as the single source of truth is probably best for the long haul, because that is what it was built and optimized for. Kafka can definitely act as a "secondary" source of truth, which would ideally mirror the single source except in rare circumstances. In other words, Kafka is best as a temporary source of truth, bounded by the TTL of the topic (see the retention sketch below). Note also that critical event data should be written as raw, immutable records, for various replay reasons and not just recovery (at least from a data perspective).
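To make the "temporary, TTL-bounded" point concrete, here is a minimal sketch using Kafka's AdminClient from Scala. The broker address, topic name, partition count, and seven-day retention are all illustrative, not recommendations:

```scala
import java.util.Properties
import org.apache.kafka.clients.admin.{AdminClient, NewTopic}
import org.apache.kafka.common.config.TopicConfig
import scala.jdk.CollectionConverters._

object CreateUserEventsTopic extends App {
  val props = new Properties()
  props.put("bootstrap.servers", "localhost:9092") // illustrative broker address

  val admin = AdminClient.create(props)
  try {
    // A "delete" cleanup policy plus retention.ms gives the topic a TTL:
    // records older than seven days are removed, so any notion of Kafka
    // as a source of truth ends there.
    val topic = new NewTopic("user-events", 3, 1.toShort).configs(
      Map(
        TopicConfig.CLEANUP_POLICY_CONFIG -> TopicConfig.CLEANUP_POLICY_DELETE,
        TopicConfig.RETENTION_MS_CONFIG   -> (7L * 24 * 60 * 60 * 1000).toString
      ).asJava
    )
    admin.createTopics(Seq(topic).asJava).all().get()
  } finally {
    admin.close()
  }
}
```

You can of course set retention to "forever", but then you are consciously taking on the job of operating Kafka as long-term storage; keeping the database as the durable record and treating the topic as replayable transport avoids that.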
Of course, as mentioned, some would argue that the Kafka position is just as valid, so it ultimately falls on you to make the decision. That said, we do not believe Kafka should be relied on in this way at this time (for long-haul records).
Additional Thoughts
Another question implicit in the Lagom design is: who owns the data schema? Any communication channel between services restricts the owning service from making certain types of changes. Lagom intentionally decouples the internal representation owned by the service from the published representation(s) in the Kafka topic. This may seem objectionable at first, because it looks like extra boilerplate and redundant data types, but most services quickly run into cases where the two need to differ, or where the internal representation needs to evolve in a way that isn't transparently compatible for consumers.

One example, sketched below, arises as soon as you have some kind of `UserCreated` event containing a password hash. You may want your user service to publish the fact that a user was created to other services, but not share the password hash. With Kafka alone you could accomplish this by designating some topics as "internal" and others as consumer-facing, but it might prove too great a temptation to take the shortcut and have consumers read from the internal topic. Using a database as the source of truth doesn't innately prevent this problem either, but Lagom's clear delineation between the database as the internal event log and Kafka as the message bus makes it easier to avoid.
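Here is a minimal Lagom (Scala) sketch of that delineation, assuming a hypothetical user service. The event, message, and class names are illustrative, and the JSON serializers and service descriptor a real service would need are omitted:

```scala
import com.lightbend.lagom.scaladsl.api.broker.Topic
import com.lightbend.lagom.scaladsl.broker.TopicProducer
import com.lightbend.lagom.scaladsl.persistence.{
  AggregateEvent, AggregateEventTag, PersistentEntityRegistry
}

// Internal events, stored in the service's database-backed event log.
sealed trait UserEvent extends AggregateEvent[UserEvent] {
  override def aggregateTag: AggregateEventTag[UserEvent] = UserEvent.Tag
}
object UserEvent {
  val Tag: AggregateEventTag[UserEvent] = AggregateEventTag[UserEvent]
}

// The internal representation keeps the password hash, because the
// service itself needs it.
final case class UserCreated(userId: String, email: String, passwordHash: String)
  extends UserEvent

// The published representation for the Kafka topic carries no hash.
final case class UserCreatedMessage(userId: String, email: String)

class UserTopics(registry: PersistentEntityRegistry) {

  // Read the internal event stream from the database-backed journal and
  // map each event to its public shape before it ever reaches Kafka.
  def userEvents: Topic[UserCreatedMessage] =
    TopicProducer.singleStreamWithOffset { fromOffset =>
      registry
        .eventStream(UserEvent.Tag, fromOffset)
        .map { element =>
          element.event match {
            case UserCreated(id, email, _) =>
              // The password hash is deliberately dropped on the way out.
              (UserCreatedMessage(id, email), element.offset)
          }
        }
    }
}
```

The point of the two case classes is exactly the "boilerplate" the paragraph above defends: the internal event can gain, rename, or remove fields as the service evolves, while the published message changes only when you choose to change the contract with consumers.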