Let $M$ be an $A$-module and let $\mathfrak{a}$ be an ideal of $A$. Suppose that $\mathfrak{m} \cdot M =0$ for every maximal ideal $\mathfrak{m}$ of $A$ such that $\mathfrak{a} \subseteq \mathfrak{m}$. Is $M$ the trivial module?
Here $\mathfrak{m} \cdot M = \{\sum_{\text{finite}} a_{i} x_{i} : a_{i} \in \mathfrak{m},\, x_{i} \in M \}$.
If we instead assume that $M_{\mathfrak{m}} = 0$ (the localization of $M$ at $\mathfrak{m}$) for every such $\mathfrak{m}$, then we can conclude that $M = \mathfrak{a}M$; this is an exercise in Atiyah and Macdonald's book. What happens if we replace the localization condition by the condition $\mathfrak{m} \cdot M = 0$?
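In other words (restating both in the same notation), the known result and the question asked here are:

$$\bigl(M_{\mathfrak{m}} = 0 \text{ for every maximal } \mathfrak{m} \supseteq \mathfrak{a}\bigr) \;\Longrightarrow\; M = \mathfrak{a}M \qquad \text{(the Atiyah--Macdonald exercise)},$$

$$\bigl(\mathfrak{m} \cdot M = 0 \text{ for every maximal } \mathfrak{m} \supseteq \mathfrak{a}\bigr) \;\overset{?}{\Longrightarrow}\; M = 0.$$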