Characteristics of a secure system
A communication system has to provide certain guarantees to be considered secure. While there are certainly many characteristics that are desirable in a communication system, the requirements for a secure system can be summarized as confidentiality, integrity, and authenticity. All three of these characteristics have to hold; if one of them is weak or even missing, the whole system cannot be considered secure.
Security has to be distinguished from clandestinity. A secure system is not required to hide the fact that communication is taking place or to obfuscate its participants. That would constitute a different requirement on a communication system.
These requirements also do not include mechanisms to contain the consequences of a breach. Perfect forward secrecy, future secrecy, and plausible deniability are all methods to limit the effects of a compromise: they restrict how much an attacker gains by learning the encryption secret (the key) at some point in time. For a system to be called secure, it's not necessary to implement these measures.
Although this definition of security might seem very limited, and real-world applications will certainly implement many of the other characteristics mentioned, it helps when reasoning about a system to have clear-cut definitions that don't include neighboring concerns.
Confidentiality
Confidentiality means that only the intended recipients of a message are able to read it. Everybody else should see only an indecipherable mass of seemingly random characters. This is where encryption algorithms are used. For an in-depth discussion of encryption, see the background text on public key cryptography. Encryption traditionally cares only about the content of a message. Sender, receiver, time of sending, size of the message: none of this meta-information about the communication itself is addressed by encryption.
This ignores that the mere occurrence of communication between two users might allow an educated guess at the content.
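As an illustration (not an example from this text), a toy one-time-pad sketch in Python shows both what encryption provides and what it leaves exposed: without the key the content is unrecoverable, but meta-information such as the message length stays visible.

```python
import secrets

def xor_cipher(key: bytes, data: bytes) -> bytes:
    """XOR data with a key of equal length; XOR is its own inverse,
    so the same function both encrypts and decrypts."""
    assert len(key) == len(data)
    return bytes(k ^ b for k, b in zip(key, data))

message = b"meet me at noon"
key = secrets.token_bytes(len(message))   # fresh random one-time key
ciphertext = xor_cipher(key, message)

# With the key, the plaintext is recovered exactly.
assert xor_cipher(key, ciphertext) == message
# Without the key, the ciphertext is indistinguishable from random bytes,
# but note what encryption does NOT hide: the length of the message.
assert len(ciphertext) == len(message)
```

A real system would use an authenticated cipher rather than a raw one-time pad, but the point stands: encryption hides content, not the fact or shape of the communication.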
Integrity
Integrity ensures that messages reach their intended receivers unmodified. Any modification done to a message while in transit should be immediately obvious.
This looks like it's already achieved by encryption, because a successful modification requires knowledge of the encryption key. Without that key, any modification would likely corrupt the message. If the message was text, the damage might be obvious, but if it was binary, the damaged result might even represent valid content. Another attack would be to simply append an old, intercepted message to a new one. Depending on how the encryption works in detail, this might go unnoticed, since the appended ciphertext was produced with the correct key.
To decouple the guarantee of integrity from the applied encryption – the guarantee of confidentiality – actual systems use cryptographic checksums (message authentication codes) of the content to detect modifications.
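One common form of such a keyed checksum is an HMAC. The sketch below (illustrative; the shared key and messages are made up) shows how a receiver can detect any modification of the content, independently of whether the content is also encrypted:

```python
import hashlib
import hmac

KEY = b"shared secret between sender and receiver"

def protect(message: bytes) -> bytes:
    """Compute a keyed checksum (MAC) to send along with the message."""
    return hmac.new(KEY, message, hashlib.sha256).digest()

def verify(message: bytes, tag: bytes) -> bool:
    """Recompute the MAC and compare in constant time."""
    expected = hmac.new(KEY, message, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)

msg = b"transfer 100 euros to Alice"
tag = protect(msg)
assert verify(msg, tag)                                  # untouched: passes
assert not verify(b"transfer 900 euros to Alice", tag)   # modified: detected
```

Without the key, an attacker cannot produce a valid tag for a modified message, so tampering in transit becomes immediately obvious to the receiver.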
Authenticity
Authenticity is about the certainty that a message actually comes from its alleged sender. Impersonation should be impossible, or at least obvious when attempted. More generally, authenticity protects against man-in-the-middle attacks. Put the other way around, man-in-the-middle attacks are essentially attempts to subvert a communication system's methods of guaranteeing authenticity.
This is usually implemented by adding a piece of data to the communication which only the alleged sender can create, but everyone else can validate.
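Digital signatures have exactly this asymmetry. The following deliberately insecure toy, using textbook-small RSA parameters (61 and 53 are illustrative primes, far too small for real use), shows the principle: only the holder of the private exponent can create the signature, but anyone with the public values can validate it.

```python
import hashlib

# Toy RSA parameters for illustration only; real keys are thousands of bits.
p, q = 61, 53
n = p * q        # 3233: public modulus
e = 17           # public exponent, known to everyone
d = 2753         # private exponent: e * d = 1 mod (p-1)*(q-1), known only to the sender

def digest(message: bytes) -> int:
    """Reduce the message to a number the toy modulus can handle."""
    return int.from_bytes(hashlib.sha256(message).digest(), "big") % n

def sign(message: bytes) -> int:
    """Only the holder of the private exponent d can compute this."""
    return pow(digest(message), d, n)

def verify(message: bytes, signature: int) -> bool:
    """Anyone who knows the public values (n, e) can check the signature."""
    return pow(signature, e, n) == digest(message)

sig = sign(b"hello from Alice")
assert verify(b"hello from Alice", sig)        # genuine signature validates
assert not verify(b"hello from Alice", (sig + 1) % n)  # forged value fails
```

The same structure scales up to real signature schemes: the sender attaches data derived from a private key, and any recipient can validate it against the corresponding public key.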
Open source and reproducibility
While open source and reproducibility are not part of the requirements for a secure system, for all practical purposes it's better to be able to validate a software vendor's promises than to be forced to blindly believe them.
Being open source doesn't, of course, imply that a validation (aka a code audit) has actually taken place, that the software upholds its promises, or that it is free of bugs. But unlike with closed-source software (which shares all of the same problems), there is at least the possibility of checking.