Largest Numerator of Sum of Egyptian Fractions
To answer this question, I'm going to use some ideas that will also be present in a (hopefully) forthcoming paper of mine on somewhat related questions about harmonic sums. I will make repeated (ab)use of the prime number theorem and little-o notation; $p$ always denotes a prime number; and if anything is unclear, please feel free to ask.
The idea is that we initially choose $c(k) = 1$ for all $k$, and then change some $c(k)$ to $0$ if it turns out that the denominator of $\displaystyle \sum_{k=1}^n \frac{c(k)}{k}$ does not equal $L_n := \operatorname{lcm}(1, \ldots, n)$. The end result is that we can get quite close to your upper bound. More precisely, we have the following theorem.
Theorem. It is possible to choose $c(k) \in \{0,1\}$ (for $1 \le k \le n$) such that the numerator of $\displaystyle \sum_{k=1}^n \frac{c(k)}{k}$ is bigger than $L_n \left(H_n - \frac{1+o(1)}{\log^2(n)}\right)$.
Proof. Define $X_n = L_n \displaystyle \sum_{k=1}^n \frac{c(k)}{k}$. Then we want $\gcd(X_n, L_n) = 1$ (so that $L_n$ truly is the denominator of the sum) and $X_n$ as big as possible. To ensure $\gcd(X_n, L_n) = 1$, we need $X_n$ to not be divisible by any prime $p \le n$. So let $p \le n$ be a prime and assume for now, for simplicity, that $p > \sqrt{n}$. Then, after initially choosing $c(k) = 1$ for all $k$, let us look at $X_n \pmod{p}$, noting that $\frac{L_n}{k} \not\equiv 0 \pmod{p}$ precisely when $p \mid k$:
\begin{align*} X_n &= \displaystyle \sum_{k=1}^n \frac{L_n}{k} \\ &\equiv \frac{L_n}{p} + \frac{L_n}{2p} + \ldots + \frac{L_n}{lp} \pmod{p} \end{align*}
with $l = \left \lfloor \frac{n}{p} \right \rfloor$.
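This reduction is easy to sanity-check numerically. The following sketch (the values $n = 50$ and $p = 11$ are arbitrary sample choices with $p$ prime and $p > \sqrt{n}$) confirms that, with all $c(k) = 1$, only the terms with $p \mid k$ survive modulo $p$:

```python
from math import gcd
from functools import reduce

def lcm_upto(n):
    # L_n = lcm(1, ..., n)
    return reduce(lambda a, b: a * b // gcd(a, b), range(1, n + 1), 1)

n, p = 50, 11  # sample values: p prime, p > sqrt(n)
L = lcm_upto(n)
X = sum(L // k for k in range(1, n + 1))  # all c(k) = 1

# Only the terms k = p, 2p, ..., lp are nonzero mod p,
# since p divides L_n exactly once when p > sqrt(n).
l = n // p
survivors = sum(L // (j * p) for j in range(1, l + 1))
assert (X - survivors) % p == 0
```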
If we are unlucky then the above sum is $0 \pmod{p}$. But then we just choose $c(lp) = 0$ to change the value of $X_n \pmod{p}$ to something non-zero.
In general (when $p$ is possibly smaller than $\sqrt{n}$) let $p^m$ be the largest power of $p$ with $p^m \le n$. Then $\frac{L_n}{k} \not\equiv 0 \pmod{p}$ precisely when $p^m$ divides $k$. And with $l = \left \lfloor \frac{n}{p^m} \right \rfloor$, if $X_n \equiv 0 \pmod{p}$ with all $c(k) = 1$, then simply choose $c(lp^m) = 0$ and we are fine again.
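The whole procedure can be simulated. The sketch below is my own illustration, not part of the proof: it starts from all $c(k) = 1$ and, whenever some prime $p \le n$ divides $X_n$, zeroes out the term $c(lp^m)$. The fallback to the next lower multiple of $p^m$ (in case that term was already removed for an earlier prime) is an added safeguard the proof does not need to spell out.

```python
from math import gcd
from functools import reduce

def lcm_upto(n):
    return reduce(lambda a, b: a * b // gcd(a, b), range(1, n + 1), 1)

def construct(n):
    """Choose c(k) in {0, 1} so that gcd(X_n, L_n) = 1, zeroing as few terms as possible."""
    L = lcm_upto(n)
    c = [0] + [1] * n                      # c[k] for 1 <= k <= n
    X = sum(L // k for k in range(1, n + 1))
    while gcd(X, L) > 1:
        # smallest prime p <= n dividing X_n (every prime <= n divides L_n)
        p = next(q for q in range(2, n + 1)
                 if X % q == 0 and all(q % d for d in range(2, int(q ** 0.5) + 1)))
        pm = p
        while pm * p <= n:                 # largest power p^m <= n
            pm *= p
        k = (n // pm) * pm                 # the term l * p^m from the proof
        while k > 0 and c[k] == 0:         # already removed: fall back one multiple
            k -= pm
        assert k > 0, "sketch assumes a free multiple of p^m remains"
        c[k] = 0
        X -= L // k
    return X, L, c
```

For example, `construct(30)` returns an $X_{30}$ with $\gcd(X_{30}, L_{30}) = 1$; in practice only very few terms ever need to be zeroed out.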
Another thing to note is that, with $c(lp) = 1$ for all $l$, when $p \ge \frac{(1+o(1))n}{\log(n)}$ we automatically have $X_n \not\equiv 0 \pmod{p}$. Indeed, in that case $X_n \equiv \dfrac{L_n}{p} \displaystyle \sum_{k=1}^l \frac{1}{k} \pmod{p}$, which is $\equiv 0 \pmod{p}$ if and only if the numerator of $\displaystyle \sum_{k=1}^l \frac{1}{k}$ is divisible by $p$ (interpreting the fractions $\frac{1}{k}$ as inverses modulo $p$). But this numerator is smaller than $L_lH_l < e^{(1+o(1))l} \le p$, so it is definitely not divisible by $p$. This means that if we have to change $c(lp)$ to $0$ for some $p$ with $\sqrt{n} < p \le n$, then $p < \frac{(1+o(1))n}{\log(n)}$, and from this it follows that $lp > n - \frac{(1+o(1))n}{\log(n)}$.
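The underlying equivalence can be tested directly. The check below (with the arbitrary sample size $n = 60$) verifies that for every prime $\sqrt{n} < p \le n$, $p$ divides $X_n$ (with all $c(k) = 1$) exactly when $p$ divides the numerator of $H_l$ with $l = \lfloor n/p \rfloor$:

```python
from math import gcd, isqrt
from functools import reduce
from fractions import Fraction

def lcm_upto(n):
    return reduce(lambda a, b: a * b // gcd(a, b), range(1, n + 1), 1)

n = 60  # arbitrary sample size
L = lcm_upto(n)
X = sum(L // k for k in range(1, n + 1))  # all c(k) = 1

for p in range(isqrt(n) + 1, n + 1):
    if any(p % d == 0 for d in range(2, isqrt(p) + 1)):
        continue  # skip composites
    l = n // p
    H_l = sum(Fraction(1, k) for k in range(1, l + 1))
    # p | X_n exactly when p divides the numerator of H_l
    assert (X % p == 0) == (H_l.numerator % p == 0)
```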
In conclusion, we can simply choose $c(k) = 1$ for all $k$, except for some $k$ of the form $k = lp^m$ where we may have to choose $c(k) = 0$. So either $p \le \sqrt{n}$ (in which case $k > \frac{n}{2}$, since $k = lp^m \ge \max\left(p^m, n - p^m\right)$) or $\sqrt{n} < p < \frac{(1+o(1))n}{\log(n)}$ and $k > n - \frac{(1+o(1))n}{\log(n)}$. Let us denote $\left \lfloor \frac{(1+o(1))n}{\log(n)} \right \rfloor$ by $x$.
We can now calculate a lower bound on $X_n$.
\begin{align*} X_n &= L_n \sum_{k=1}^{n} \frac{c(k)}{k} \\ &\ge L_n \left(\sum_{k=1}^{n} \frac{1}{k} - \sum_{k=n-x+1}^{n-x+\pi(x)} \frac{1}{k} - \sum_{k = \left \lfloor \frac{1}{2}n \right \rfloor + 1}^{\left \lfloor \frac{1}{2}n \right \rfloor + \pi(\sqrt{n})} \frac{1}{k}\right) \\ &> L_n \left(\sum_{k=1}^{n} \frac{1}{k} - \frac{\pi(x)}{n-x} - \frac{2\pi(\sqrt{n})}{n}\right) \\ &= L_n\left(H_n - \frac{1+o(1)}{\log^2(n)}\right) \end{align*}
And the proof is finished.
If one is more interested in explicit bounds, one can without too much trouble use explicit estimates on prime counting functions to prove, for example, $X_n > L_n \left(H_n - \frac{4}{\log^2(n)} \right)$ for all $n \ge 2$.
In the other direction, looking at $X_n \pmod{3}$: for $n = 3^{m+1}-1$ one needs $c(3^{m})$ or $c(2\cdot3^{m})$ to be $0$ (or $-1$), or else $3 \mid X_n$, so that after reduction the numerator is at most $\frac{1}{3}L_nH_n$. So $X_n \le L_n \left(H_n - \frac{3}{2(n+1)} \right)$ infinitely often. With a bit of work one can show the very modest improvement $X_n < L_n \left(H_n - \frac{2}{n}\right)$ infinitely often, by finding infinitely many $n$ such that there exist $m_1, m_2 \in \mathbb{N}$ for which simultaneously $2\cdot3^{m_1} \le n < 3^{m_1+1}$ and $4\cdot5^{m_2} \le n < 5^{m_2+1}$. No doubt for every $c > 0$ there are infinitely many $n$ such that $X_n < L_n \left(H_n - \frac{c}{n}\right)$ holds, and maybe some pigeonhole-type argument could give us that, but as of now I do not see how.
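The first claim can be confirmed directly: for $n = 3^{m+1}-1$ the multiples of $3^m$ up to $n$ are exactly $3^m$ and $2 \cdot 3^m$, and their two surviving terms sum to $\frac{L_n}{2 \cdot 3^m}(2 + 1) \equiv 0 \pmod{3}$. A quick check over the first few such $n$:

```python
from math import gcd
from functools import reduce

def lcm_upto(n):
    return reduce(lambda a, b: a * b // gcd(a, b), range(1, n + 1), 1)

for m in range(1, 5):
    n = 3 ** (m + 1) - 1
    L = lcm_upto(n)
    X = sum(L // k for k in range(1, n + 1))  # all c(k) = 1
    # 3 divides X_n, so the unmodified sum never has denominator L_n here
    assert X % 3 == 0
```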