The impact of the finite grid size in SOLPS-ITER edge plasma simulations is assessed for JET H-mode discharges with a metal wall. For a semi-horizontal divertor configuration it is shown that the separatrix density is at least 30% higher when a narrow scrape-off layer (SOL) grid width is chosen in SOLPS-ITER than when the SOL grid width is maximised. The density increase is caused by kinetic neutrals not being confined inside the divertor region because of the reduced extent of the plasma grid. In this case, an enhanced level of reflections of energetic neutrals at the low-field side (LFS) metal divertor wall is observed. This shifts the ionisation source further upstream, which must be regarded as a numerical artefact. The cooling at the divertor entrance is then overestimated, identified by a reduced heat flux decay parameter λ_q^div. Otherwise, and further upstream, the mid-plane heat flux decay length λ_q is not affected by any change in divertor dissipation. This confirms the assumptions made for the ITER divertor design studies, i.e. that λ_q upstream is essentially set by the assumed ratio of radial to parallel heat conductivity. It is also shown that even for attached conditions the decay length ordering λ_ne > λ_Te > λ_q holds in the near-SOL upstream. Thus, for interpretative edge plasma simulations the (experimental) value of λ_ne, rather than λ_q, must be taken into account, as the former defines the required minimum upstream SOL grid extent.
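The link between λ_ne and the minimum upstream grid extent can be illustrated with a minimal sketch, assuming a purely exponential near-SOL density fall-off and an arbitrarily chosen boundary criterion (the 1% fraction and the function name are assumptions for illustration, not from the paper):

```python
import math

# Illustrative sketch (assumption, not the paper's method): if the upstream
# SOL density follows n_e(r) = n_sep * exp(-r / lam_ne), the radial grid
# must extend several decay lengths so that the density at the outer grid
# boundary is a negligible fraction of its separatrix value.

def min_sol_grid_extent(lam_ne_mm: float, boundary_fraction: float = 0.01) -> float:
    """Radial extent (mm) at which n_e has fallen to `boundary_fraction`
    of its separatrix value, assuming an exponential profile."""
    return lam_ne_mm * math.log(1.0 / boundary_fraction)

# Example: lam_ne = 3 mm with a 1% boundary criterion implies roughly
# ln(100) * 3 mm ≈ 13.8 mm of radial grid width.
print(round(min_sol_grid_extent(3.0), 1))
```

Since λ_ne > λ_Te > λ_q in the near-SOL, a grid sized from λ_q alone would truncate the density profile, which is why the abstract singles out λ_ne as the relevant scale.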