To refine my above post and bring it closer to the point, here's another thought or two.
The only place where natural languages really allow the creation of components is the coining of new words. Some languages are more open to neologism than others (English is extremely open, for instance), but it's universally more acceptable to make up a word that "sounds right" than to, say, make up a new word order. (Even do that, you may, but be easily understood, you will not.)

The catch is that there are classes of words that are open and classes that are closed. Function words like "he" and "the" and "to" generally belong to very closed classes, where the closest you can come to neologism is an accent or using the wrong form of a word (like "him" instead of "he"), and even those changes are generally part of a dialect. These "words" are more structure than they are content. They too are an arbitrary conjunction of form and meaning (after all, "he" carries precisely the same meaning as German er, Russian on and Hebrew hu), but one may imagine a language with no open classes of words at all: a language with a very large closed class of forms bound to abstract meanings, where the only way to express a meaning not already bound to an existing item is to put existing items together in a way that creates the meaning you want. If such a language had extremely intricate rules for how items combine to create new meanings, what you would have is essentially a language of pure structure, only rules, with no incoming chaos. But it would inevitably be severely limited in transmitting information, because the number of possible propositions would be something like (number of items) to the power of (number of possible ways to conjoin items), and there is a limit to how much of either the human brain can handle.
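To make that ceiling concrete, here is a toy calculation (a rough sketch only; the numbers and the reading of "ways to conjoin items" as a fixed count of combination slots are my illustrative assumptions, not a real linguistic model):

```python
# Toy estimate of how many distinct propositions a purely structural
# language could express, assuming (illustratively) that a proposition
# is an ordered choice of items filling a fixed number of slots.

def proposition_count(num_items: int, num_slots: int) -> int:
    # Each of the num_slots positions can hold any of the num_items
    # forms, giving num_items ** num_slots possible messages.
    return num_items ** num_slots

# Growth is fast, but both factors are capped by human limits:
# memory caps the item inventory, processing caps the slot count.
for items, slots in [(100, 2), (1000, 3), (10000, 4)]:
    print(f"{items} items, {slots} slots -> {proposition_count(items, slots)} propositions")
```

The point of the sketch is only that with both factors bounded, the message space is bounded too, however intricate the combination rules are.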
Taken to one absolute extreme, where no meaning is arbitrary, you would have a tiny class of onomatopoeia (or iconic hand signs) which can only express a very limited range of possible messages. At the other end of the spectrum is context-free nonsense, where there are no closed classes of lexical items and no rules on how to put items together, so you're always inventing a new stream of sounds to express the specific meaning you want to transmit. Both ends of this spectrum are severely limited for the transfer of information.

What you get in natural languages is a balance: a lot of arbitrarily assigned meaning, a fair amount of room for free creation of new content (neologisms, idioms), and a restrictive but not asphyxiating set of constraints that tell you how to interpret the juxtaposition of elements. As for expressive power, tipping the scale slightly toward creativity, as is common in poetry, gives you more expression (but less precision), while tipping it slightly the other way, as in legalese, gives you a far narrower range of expression but maximum precision (= minimum ambiguity).