Is it mandatory to specify integration variables before taking the integral?

The usual advice at this stage is to treat expressions like $dy = x\ dx$ as notational shorthand, without worrying too much about the actual underlying meaning. You can still rigorously make sense of separation of variables without the use of differentials, as follows:

Consider the ODE,

$$ f(y)\ \frac{dy}{dx} = g(x).$$

We then assume there is a function $y = y(x)$ which satisfies the above relation. Integrating both sides with respect to $x,$ we get,

$$ \int f(y(x)) \frac{dy}{dx}\ dx = \int g(x)\ dx. $$

Now we can perform a change of variables in the first integral to get,

$$ \int f(y)\ dy = \int g(x)\ dx. $$
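As a concrete illustration (not part of the argument above), take $f(y) = y$ and $g(x) = x$, so the ODE is $y\,\frac{dy}{dx} = x$. Separating and integrating gives $\tfrac{y^2}{2} = \tfrac{x^2}{2} + C$, i.e. $y = \sqrt{x^2 + 2C}$. A quick symbolic check with sympy confirms that this candidate really satisfies the ODE:

```python
import sympy as sp

x, C = sp.symbols('x C')
# Candidate solution of y * dy/dx = x obtained by separating variables:
# integrating y dy = x dx gives y^2/2 = x^2/2 + C, i.e. y = sqrt(x^2 + 2C).
y = sp.sqrt(x**2 + 2*C)
# Verify it satisfies f(y) * y' = g(x) with f(y) = y, g(x) = x:
residual = sp.simplify(y * sp.diff(y, x) - x)
print(residual)  # 0
```

The residual simplifying to $0$ is exactly the statement that the function recovered by the two antiderivatives solves the original equation.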

This is how to think of the separation of variables method: we start with an equality in $x,$ then we integrate along this variable.

Of course this raises the question of how to justify the change of variables, but that's a different question altogether. The short answer is that you can't rigorously justify it without properly defining what an integral is, which is done in analysis.


The more intuitive idea in terms of infinitesimal differences, however, is also worth keeping in mind.

The intuition behind integrating objects like $g(x)\ dx$ is that we sum them. Since $dx$ intuitively represents some small change in the variable $x,$ what we are doing is summing the small changes at each point. If we are integrating over $[0,1]$ and we have $x_i$'s such that, $$ 0 = x_0 < x_1 < \dots < x_n = 1,$$

then with enough points we intuitively expect,

$$ \int_0^1 g(x)\ dx \simeq \sum_{i=1}^n g(x_i) (x_i - x_{i-1}). $$

So in a way we approximate $dx \simeq x_i - x_{i-1}.$ In fact, later on you will see that the integral is actually defined as a kind of limit of the above sum. The takeaway for the time being, however, is that differentials give 'small differences' and integrating over these differentials means 'summing these small differences.'
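This sum can be computed directly. Here is a small numerical sketch (the function $g(x) = x^2$ and partition size are just illustrative choices) showing that the sum $\sum g(x_i)(x_i - x_{i-1})$ over a fine partition of $[0,1]$ lands close to the exact integral, which is $1/3$:

```python
import numpy as np

# Approximate the integral of g(x) = x**2 over [0, 1] by the sum
# sum g(x_i) (x_i - x_{i-1}); the exact value is 1/3.
g = lambda t: t**2
n = 100_000
xs = np.linspace(0.0, 1.0, n + 1)              # 0 = x_0 < x_1 < ... < x_n = 1
riemann_sum = np.sum(g(xs[1:]) * np.diff(xs))  # sum of g(x_i) * (x_i - x_{i-1})
print(abs(riemann_sum - 1/3) < 1e-4)  # True
```

With more points the approximation tightens, which is precisely the limiting process used to define the integral.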

Returning to the equation $f(y)\frac{dy}{dx}=g(x)$, what we do here is again suppose we have a function $y=y(x)$ such that the ODE holds. We also assume that this relation between $x$ and $y$ is invertible, in that we can also write $x=x(y).$ Note that we implicitly assumed this in the above manipulation, when performing a change of variables.

If this holds, then we note that we can make the approximation,

$$ f(y_1)\,(y_2-y_1) \simeq g(x_1)\,(x_2-x_1)$$

where $y_1 = y(x_1)$ and $y_2 = y(x_2).$ This holds when $x_1$ and $x_2$ are sufficiently close together, and is a consequence of the linear approximation property of the derivative, which asserts that,

$$ \frac{dy}{dx}(x_1) \simeq \frac{y_2-y_1}{x_2-x_1}. $$

This is the defining intuitive idea behind the derivative, though the formal definition you may encounter later on is a bit different.
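You can watch this approximation improve numerically. In this sketch (the choice $y(x) = x^3$ at $x_1 = 1$ is just an example), the difference quotient approaches the exact derivative $3$ as $x_2 \to x_1$:

```python
# The difference quotient (y2 - y1)/(x2 - x1) approximates dy/dx,
# and improves as x2 approaches x1.  Here y(x) = x**3, whose
# derivative at x = 1 is exactly 3.
y = lambda t: t**3
x1 = 1.0
for h in (1e-1, 1e-3, 1e-5):
    x2 = x1 + h
    quotient = (y(x2) - y(x1)) / (x2 - x1)
    print(h, quotient)  # quotient approaches 3.0 as h shrinks
```

The smaller the gap $x_2 - x_1$, the closer the quotient sits to the true slope, which is the content of the linear approximation property.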

So when we write $f(y)\ dy = g(x)\ dx,$ we can interpret it to mean that the approximation property above holds.

The idea now is to sum these differentials as discussed above. So informally we get,

$$ \int g(x) dx \simeq \sum g(x) \Delta x \simeq \sum f(y) \Delta y \simeq \int f(y) dy. $$

Now this manipulation is obviously not fully rigorous and is missing a number of details. You can see, however, that summing the $dx$'s on one side results in summing over $dy$ on the other, because of the relation between $x$ and $y.$
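To make the informal chain of sums concrete, here is a numerical sketch (the particular ODE and partition are illustrative). For the ODE $y\,\frac{dy}{dx} = x$, with $f(y) = y$ and $g(x) = x$, one solution is $y(x) = \sqrt{x^2 + 1}$; summing $f(y_i)\,\Delta y_i$ and $g(x_i)\,\Delta x_i$ over the same partition of $[0,1]$ gives nearly equal totals:

```python
import numpy as np

# For y * dy/dx = x (f(y) = y, g(x) = x), one solution is
# y(x) = sqrt(x**2 + 1).  Compare the two sums over one partition.
xs = np.linspace(0.0, 1.0, 100_001)
ys = np.sqrt(xs**2 + 1)
sum_g = np.sum(xs[1:] * np.diff(xs))   # sum of g(x_i) (x_i - x_{i-1})
sum_f = np.sum(ys[1:] * np.diff(ys))   # sum of f(y_i) (y_i - y_{i-1})
print(abs(sum_f - sum_g) < 1e-4)  # True
```

Both sums are close to $1/2$, matching $\int_0^1 x\,dx = \int_1^{\sqrt 2} y\,dy = \tfrac12$, which is the informal identity $\sum f(y)\,\Delta y \simeq \sum g(x)\,\Delta x$ made numerical.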


The question "integrate $2x$" is a shorthand way of asking the following concrete question:

Find an anti-derivative of the function $f:\mathbb{R} \to \mathbb{R} $ given by $f(x) =2x$.

Or more precisely

Let $f:\mathbb{R}\to\mathbb{R} $ be defined by $f(x) =2x$. Find a function $g:\mathbb{R} \to\mathbb{R} $ such that $g'(x) = f(x) $ for all $x\in\mathbb{R} $.

So in effect you need to find a function with some specific properties. The variable $x$ does not come into the picture. It is used only as a means to specify the function via a formula. The notation $\int f(x) \, dx$ is just a notation for the anti-derivative operation. The $\int$ and $dx$ are used together, and the variable $x$ in $dx$ is used to match the variable $x$ which is used to specify the function $f$.
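For instance (a quick illustrative check with sympy, not part of the definition above), "integrate $2x$" produces a function $g$ whose derivative is $f$:

```python
import sympy as sp

x = sp.symbols('x')
# "Integrate 2x" asks for an antiderivative: a function g with g'(x) = 2x.
g = sp.integrate(2*x, x)
print(g)                      # x**2
print(sp.diff(g, x) == 2*x)   # True
```

Note that sympy, like the usual classroom convention, returns one antiderivative and omits the constant of integration; $x^2 + C$ works for any constant $C$.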

But the notation using $dx$ has its own advantages (in particular, it helps us remember the technique of substitution and the formula for integration by parts) and hence is commonly used. For derivatives we are lucky to have the notation $f'(x)$ as an alternative to $\dfrac{d}{dx} f(x)$, but we don't have a corresponding notation for the integral which avoids the $dx$.

Later, when you study definite integrals and their theory, you will find the $dx$ and $x$ being dropped in favor of the simpler notation $\int_{a}^{b} f$.


Another meaning that nobody has mentioned is that, if we take the geometrical interpretation, then $$\int_a ^b \! f(x) \, \mathrm d x$$ is the area between $f(x)$ and the $x$ axis, from $x=a$ to $x=b$.

You could think of it as dividing that area in tiny rectangles of height $f(x)$ and width $\mathrm d x$ and adding (integrating) from $x=a$ to $x=b$.

When you integrate, you consider $\mathrm d x$ to be an infinitesimal, so you divide the interval $[a,b]$ into infinitely many parts and sum infinitely many tiny rectangles of height $f(x)$ and width $\mathrm d x$.

Then, geometrically, $\mathrm d x$ is the width of the tiny rectangles you are adding.

Edit: Also, the notation $\frac{\mathrm d y}{\mathrm d x}$ is the infinitesimal counterpart of the ratio $\frac{\Delta y}{\Delta x} = \frac{y_1-y_0}{x_1-x_0}$. In the latter you are taking intervals $[x_0, x_1]$ and $[y_0, y_1]$ and computing how $y$ changes with respect to $x$ on those intervals. So $\frac{\mathrm d y}{\mathrm d x}$ is the same concept, but with tiny (infinitesimal) intervals.

Edit 2: I wrote this answer after reading all the other answers, but forgot that the initial question was whether it is mandatory to write $\mathrm d x$.

It depends. For example, if you have a function $f(x,y)$ and you want to define another function $F(x,y)=\int \! f(x,y) \, \mathrm d x$, it is important to specify the variable, as $G(x,y)=\int \! f(x,y) \, \mathrm d y$ is another totally different function.
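A quick symbolic check (with the illustrative choice $f(x,y) = xy$, which is not from the question) makes the difference explicit:

```python
import sympy as sp

x, y = sp.symbols('x y')
# With f(x, y) = x*y, integrating in x and integrating in y
# produce genuinely different functions, so the dx / dy matters.
f = x*y
F = sp.integrate(f, x)   # integrate treating y as a constant
G = sp.integrate(f, y)   # integrate treating x as a constant
print(F)        # x**2*y/2
print(G)        # x*y**2/2
print(F == G)   # False
```

Without the $\mathrm d x$ or $\mathrm d y$ there would be no way to tell which of $F$ and $G$ is meant.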

But there are situations where the integration variable can be omitted for simplicity, like when you have a cycle $\Gamma = \gamma \cup \phi \cup \psi$ and you want to state that $$\int_\Gamma \! f = \int_\gamma \! f + \int_\phi \! f + \int_\psi \!f$$ Here you are talking about relations between integrals and not about a specific variable, which you will write when you have to compute something like $$\int_\gamma \! f(z) \, \mathrm d z= \int _a ^b \! f(\gamma(t)) \gamma '(t) \, \mathrm d t$$