Suppose that $X$ is a real-valued random variable on the probability space $(\mathrm{\Omega},\mathcal{F},\mathbb{P})$ with cumulative distribution function ${F}_{X}(x)=\mathbb{P}[X\le x]$. Can we conclude from some measure-theoretic property that $\mathbb{P}[X\le x]=\mathbb{P}[X<x]$? Singletons certainly have measure zero under many well-known measures, but can we conclude that, in general, removing a finite number of points from $(-\mathrm{\infty},x]$ does not change the probability of $(-\mathrm{\infty},x]$?
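To make the question precise, the property I would like to justify (my own reformulation, not something from my notes) is that the open half-line can be exhausted from below, so that by continuity of $\mathbb{P}$ from below,

$$\mathbb{P}[X<x]=\mathbb{P}\Bigl[\,\bigcup_{n=1}^{\infty}\{X\le x-\tfrac{1}{n}\}\Bigr]=\lim_{n\to\infty}F_X\!\left(x-\tfrac{1}{n}\right),$$

which would make $\mathbb{P}[X\le x]=\mathbb{P}[X<x]$ equivalent to $\mathbb{P}[X=x]=0$, i.e. to left-continuity of $F_X$ at $x$.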

I'm asking this because my reading material doesn't address it explicitly. From other courses I know that $\mathbb{P}[-x\le X\le x]={F}_{X}(x)-{F}_{X}(-x)$, but arguing only from the general properties of probability measures that I know of yields $\mathbb{P}[-x\le X\le x]=\mathbb{P}[X\le x]-\mathbb{P}[X<-x]$, whose right-hand side would simplify to an expression involving only the CDF of $X$ if the finite set of removed points didn't matter.
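For reference, the additivity step behind that last identity is the disjoint decomposition (valid when $-x\le x$, i.e. $x\ge 0$):

$$\{X\le x\}=\{X<-x\}\,\cup\,\{-x\le X\le x\}\quad\Longrightarrow\quad\mathbb{P}[-x\le X\le x]=\mathbb{P}[X\le x]-\mathbb{P}[X<-x],$$

since the two events on the left are disjoint and $\mathbb{P}$ is finitely additive.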