There is often a fundamental mismatch between programmable privacy frameworks on the one hand and the ever-shifting privacy expectations of computer system users on the other. Building on the theory of contextual integrity (CI), we address this problem by proposing a privacy framework that translates users' privacy expectations (norms) into a set of actionable privacy rules rooted in the language of CI. These norms are then encoded in Datalog, a logic specification language, to build an information system that can verify whether information flows are appropriate and users' privacy is thereby preserved. A particular benefit of our framework is that it automatically adapts as users' privacy expectations evolve over time. To evaluate the proposed framework, we conducted an extensive survey involving more than 450 participants and 1,400 questions to derive a set of privacy norms in the educational context. Based on the crowdsourced responses, we demonstrate that our framework derives a compact Datalog encoding of the privacy norms, which can in principle be used directly to enforce the privacy of information flows within this context. In addition, our framework automatically detects logical inconsistencies between individual users' privacy expectations and the derived privacy logic.
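To make the idea concrete, the check described above can be sketched in a few lines. The following is an illustrative sketch only, not the paper's actual encoding: CI describes an information flow by five parameters (sender, recipient, information subject, attribute, and transmission principle), and a Datalog-style rule admits a flow when some norm matches it. All predicate names, norms, and the wildcard convention below are hypothetical assumptions for illustration.

```python
# Hypothetical sketch of a CI norm check in the style of a Datalog rule:
#   appropriate(F) :- norm(N), matches(N, F).
# Every identifier and norm here is invented for illustration; the paper's
# actual Datalog encoding is crowdsourced and not reproduced here.

from typing import NamedTuple

class Flow(NamedTuple):
    """The five CI parameters of an information flow."""
    sender: str
    recipient: str
    subject: str
    attribute: str
    principle: str  # transmission principle

# Hypothetical norms for an educational context; "*" is a wildcard
# meaning the norm places no constraint on that parameter.
NORMS = [
    Flow("teacher", "parent", "student", "grades", "with-consent"),
    Flow("teacher", "principal", "student", "attendance", "*"),
]

def matches(norm: Flow, flow: Flow) -> bool:
    # A norm matches a flow when every parameter agrees or is a wildcard.
    return all(n == "*" or n == f for n, f in zip(norm, flow))

def appropriate(flow: Flow) -> bool:
    # A flow is appropriate iff some norm permits it
    # (a closed-world assumption: anything not permitted is a violation).
    return any(matches(norm, flow) for norm in NORMS)

print(appropriate(Flow("teacher", "parent", "student", "grades", "with-consent")))    # True
print(appropriate(Flow("teacher", "advertiser", "student", "grades", "no-consent")))  # False
```

In an actual Datalog implementation, norms would be facts and `appropriate` a derived predicate, so the same closed-world reading (flows are violations unless some norm derives them as appropriate) falls out of the semantics rather than being coded by hand.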