Given a real number $a$, the exponential $\exp(ta)$ is the solution to the initial value problem:
$$\displaystyle \begin{cases} \frac{dy}{dt} = ay & \\ y(0) = 1& \end{cases} $$
This definition does not depend very much on the fact that $a$ is a real number. $a$ could perfectly well be a complex number and nothing in the definition would need to change. The only operations involved in the definition are differentiation (i.e. subtraction, division by a real number, and taking a limit) and multiplication, so at least in principle one could use the same definition with any object $a$ for which these operations are defined. For instance, changing the real number $a$ to a matrix $A$ gives the matrix exponential $Y$:
$$\displaystyle \begin{cases} \frac{dY}{dt} = AY & \\ Y(0) = I& \end{cases} $$
All that needed to change was the initial condition: the identity matrix $I$ replaces the real number 1 (the identity element of the reals under multiplication). A small, superficial change, really.
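To make this concrete, we can hand the matrix initial value problem to a numerical ODE solver and compare the result with a library matrix exponential. This is just a sanity-check sketch; the matrix `A` and the integration tolerances below are arbitrary choices.

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.linalg import expm

# An arbitrary (non-symmetric) test matrix.
A = np.array([[0.0, 1.0],
              [-2.0, 0.5]])

# Integrate dY/dt = A Y with Y(0) = I, flattening Y into a vector for the solver.
def rhs(t, y):
    Y = y.reshape(2, 2)
    return (A @ Y).ravel()

t_final = 1.0
sol = solve_ivp(rhs, (0.0, t_final), np.eye(2).ravel(), rtol=1e-10, atol=1e-12)
Y_ode = sol.y[:, -1].reshape(2, 2)

# The integrated solution should match the library matrix exponential.
print(np.allclose(Y_ode, expm(t_final * A), atol=1e-6))  # True
```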
At this point one might start to suspect a pitfall, because matrix multiplication is not commutative. And one might wish to define exponentials for other objects whose multiplication is noncommutative, e.g. quaternions. How, then, do we decide to write $AY$ on the right-hand side instead of $YA$? Maybe there is a left exponential $L$ and a right exponential $R$ for which:
$$\displaystyle \begin{aligned} \frac{dL}{dt} &= AL \\ \frac{dR}{dt} &= RA \\ I &= L(0) = R(0) \end{aligned} $$
In this post, we will see that $L = R$ so there is no need to distinguish between left and right exponentials.
Proof
Suppose a right exponential $R(t)$ exists in some open interval containing $0$. We can compute the second derivative as:
$$\displaystyle \frac{d^2 R}{dt^2} = \lim_{h\to 0} \frac{R'(t+h) - R'(t)}{h} = \lim_{h\to 0} \frac{R(t+h)A - R(t)A}{h} = R'(t)A = RA^2$$
And so on for the higher derivatives:
$$\displaystyle \frac{d^k R}{dt^k} = R A^k $$
We can form the Maclaurin series for $R$ by evaluating these derivatives at $t=0$, using the fact that $R(0) = I$:
$$\displaystyle R = \sum_{k=0}^\infty \frac{A^k t^k}{k!} $$
This power series is absolutely convergent for all $t$, so termwise differentiation is valid and shows that the series satisfies the initial value problem. We are therefore justified in claiming this series to be the solution $R(t)$. In fact, some texts define the exponential this way to begin with and never mention the initial value problem.
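The partial sums of this series can be checked numerically against a library implementation. In the sketch below, the matrix `A`, the value of `t`, and the truncation at 30 terms are arbitrary choices; each term is built from the previous one via $\frac{(tA)^k}{k!} = \frac{(tA)^{k-1}}{(k-1)!} \cdot \frac{tA}{k}$.

```python
import numpy as np
from scipy.linalg import expm

A = np.array([[0.0, -1.0],
              [1.0,  0.0]])  # generator of 2D rotations
t = 0.7

# Partial sum of sum_k (tA)^k / k!, accumulating terms iteratively.
S = np.eye(2)        # k = 0 term
term = np.eye(2)
for k in range(1, 30):
    term = term @ (t * A) / k
    S = S + term

print(np.allclose(S, expm(t * A)))  # True
```

For this particular $A$, the series sums to the rotation matrix $\begin{pmatrix} \cos t & -\sin t \\ \sin t & \cos t \end{pmatrix}$.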
We can do the same process for $L(t)$, yielding:
$$\displaystyle \frac{d^k L}{dt^k} = A^k L $$
But evaluating these derivatives at $t=0$ and using the initial condition gives the same Maclaurin series since $I A^k = A^k I$. So:
$$\displaystyle R(t) = \sum_{k=0}^\infty \frac{A^k t^k}{k!} = L(t) $$
To reiterate, $A$ could be any object for which the necessary operations make sense.
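For matrices, we can check the conclusion numerically: a finite-difference derivative of $t \mapsto e^{tA}$ should satisfy both $\frac{dE}{dt} = AE$ and $\frac{dE}{dt} = EA$. The random matrix, evaluation point, and step size below are arbitrary choices in this sketch.

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
t, h = 0.4, 1e-6

E = expm(t * A)
# Central-difference derivative of t -> exp(tA).
dE = (expm((t + h) * A) - expm((t - h) * A)) / (2 * h)

print(np.allclose(dE, A @ E, atol=1e-5))  # True: satisfies dE/dt = A E
print(np.allclose(dE, E @ A, atol=1e-5))  # True: ... and also dE/dt = E A
```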
Verification with Quaternions
A quaternion $q = q_0 + \vec{q}$ has an exponential:
$$\displaystyle e^{tq} = e^{tq_0}\left( \cos(w t) + \frac{\vec{q}}{w}\sin(wt) \right)$$
where $w = |\vec{q}|$ (assuming $\vec{q} \neq 0$; when $\vec{q} = 0$ the exponential is simply $e^{tq_0}$). A direct calculation which we will omit here confirms that $\frac{d}{dt} e^{tq} = e^{tq} q = q e^{tq}$, so it is both the left and right exponential as expected.
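The omitted calculation can be replaced by a numerical check: compare the closed form against the Maclaurin series computed with quaternion multiplication. The helper names (`qmul`, `qexp_closed`, `qexp_series`), the test quaternion, and the truncation at 40 terms are all choices made for this sketch; quaternions are stored as arrays $[q_0, x, y, z]$.

```python
import numpy as np

def qmul(p, q):
    # Hamilton product of quaternions stored as [w, x, y, z].
    w1, v1 = p[0], p[1:]
    w2, v2 = q[0], q[1:]
    return np.concatenate(([w1 * w2 - v1 @ v2],
                           w1 * v2 + w2 * v1 + np.cross(v1, v2)))

def qexp_closed(t, q):
    # e^{tq} = e^{t q0} (cos(w t) + (vec q / w) sin(w t)),  w = |vec q| > 0
    q0, v = q[0], q[1:]
    w = np.linalg.norm(v)
    return np.exp(q0 * t) * np.concatenate(([np.cos(w * t)],
                                            np.sin(w * t) * v / w))

def qexp_series(t, q, terms=40):
    # Maclaurin series sum_k (t q)^k / k! under quaternion multiplication.
    acc = np.array([1.0, 0.0, 0.0, 0.0])   # k = 0 term (identity)
    term = acc.copy()
    for k in range(1, terms):
        term = qmul(term, t * q) / k
        acc = acc + term
    return acc

q = np.array([0.3, 1.0, -2.0, 0.5])  # arbitrary quaternion, nonzero vector part
print(np.allclose(qexp_closed(0.8, q), qexp_series(0.8, q)))  # True
```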
A Note of Caution
Although the left and right exponentials are the same object, they sit on different sides of the initial condition in their initial value problems: the solution of $Y' = AY$, $Y(0) = Y_0$ is $Y(t) = e^{tA}\,Y_0$, whereas the solution of $Y' = YA$, $Y(0) = Y_0$ is $Y(t) = Y_0\,e^{tA}$. When $A$ and $Y_0$ do not commute, these are different functions.
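The distinction can be demonstrated numerically: each placement of the initial condition solves its own equation, yet the two solutions differ when $A$ and $Y_0$ do not commute. The particular matrices, evaluation point, and step size below are arbitrary choices in this sketch.

```python
import numpy as np
from scipy.linalg import expm

# Two matrices that do not commute.
A = np.array([[0.0, 1.0], [0.0, 0.0]])
Y0 = np.array([[1.0, 0.0], [1.0, 1.0]])
t, h = 0.5, 1e-6

E = expm(t * A)
left = E @ Y0    # candidate solution of dY/dt = A Y, Y(0) = Y0
right = Y0 @ E   # candidate solution of dY/dt = Y A, Y(0) = Y0

# Finite-difference check that each satisfies its own equation.
dleft = (expm((t + h) * A) @ Y0 - expm((t - h) * A) @ Y0) / (2 * h)
dright = (Y0 @ expm((t + h) * A) - Y0 @ expm((t - h) * A)) / (2 * h)

print(np.allclose(dleft, A @ left, atol=1e-5))    # True
print(np.allclose(dright, right @ A, atol=1e-5))  # True
print(np.allclose(left, right))                   # False: the solutions differ
```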