Algebraic infinite sum notation

On an infinite-dimensional vector space one usually defines a linear combination as a finite sum; for instance, if K is a basis of the \mathbb{K}-vector space V, one may write, for J some finite subset of K:

\pmb v = \sum_{\pmb k \in J} a_{\pmb k} \pmb k

I find this notation cumbersome, because it seems to make the sum depend on the arbitrary choice of the finite set J, while in fact it doesn't, provided J contains every \pmb k whose coefficient a_{\pmb k} is nonzero.

For instance, if you wish to add two such linear combinations written with different finite sets J and J', you cannot do so directly without first arbitrarily choosing some finite subset J'' of K containing J \cup J':

\pmb v + \pmb v' = \sum_{\pmb k \in J} a_{\pmb k} \pmb k + \sum_{\pmb k \in J'} a'_{\pmb k} \pmb k = \sum_{\pmb k \in J''} (a_{\pmb k} + a'_{\pmb k}) \pmb k

while making excuses for having to define the new a_{\pmb k}'s (for \pmb k \in J'' \setminus J) and a'_{\pmb k}'s (for \pmb k \in J'' \setminus J') as zero, and for the arbitrary choice of J'', which doesn't change the result.
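This bookkeeping can be made concrete with a small sketch (hypothetical representation: a vector stored as a dict from basis labels to coefficients, the keys playing the role of the finite index sets J and J'):

```python
# Hypothetical sketch: adding two linear combinations the "finite-set" way.
# Each vector is a dict {basis_label: coefficient}; its keys are the finite
# index set (J or J') of the written sum.

def add_with_explicit_union(a, a_prime):
    """Add two coefficient families indexed by J and J' by first choosing
    J'' = J ∪ J' and extending both families by zero on it."""
    J2 = set(a) | set(a_prime)  # the finite set J'' containing J ∪ J'
    return {k: a.get(k, 0) + a_prime.get(k, 0) for k in J2}

v = {"e1": 2, "e2": 5}        # J  = {e1, e2}
w = {"e2": -5, "e3": 1}       # J' = {e2, e3}
# The result is indexed by all of J'' = {e1, e2, e3}; note that the
# coefficient 0 on e2 is carried along, exactly the clutter complained
# about above.
print(add_with_explicit_union(v, w))
```

The zero coefficient on e2 survives in the output, mirroring the excuses the notation forces on us.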

I find it more practical to accept writing sums of an arbitrary collection of objects, such as:

\pmb v = \sum_{\pmb k \in K} a_{\pmb k} \pmb k

provided we know in advance that a_{\pmb k} is nonzero for only finitely many \pmb k \in K.
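The convention above can be sketched as follows (again a hypothetical representation: a vector as a finitely supported coefficient function on K, stored as a dict that records only the nonzero coefficients):

```python
# Hypothetical sketch of the "sum over all of K" convention: a vector is a
# finitely supported family (a_k) indexed by the whole basis K; the dict
# records only the finitely many k with a_k != 0.

def add(v, w):
    """Pointwise sum of two finitely supported families. Zero coefficients
    are dropped, so the support stays finite and no arbitrary finite index
    set ever has to be chosen."""
    out = {k: v.get(k, 0) + w.get(k, 0) for k in set(v) | set(w)}
    return {k: c for k, c in out.items() if c != 0}

def support(v):
    """The finite set of k with a_k != 0 -- the only data the sum needs."""
    return set(v)

u = add({"e1": 1, "e2": 3}, {"e2": -3, "e5": 7})
print(support(u))
```

Here the sum of two vectors is again a finitely supported family, and the cancellation on e2 shrinks the support automatically, with no J'' in sight.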