Speaking of abstract vs. concrete, I strongly recommend Code: The Hidden Language of Computer Hardware and Software by Charles Petzold.
It starts off simple with switches and explains how switches can be combined into relays (essentially mechanical versions of transistors); how relays can be combined into logic gates (AND, OR, XOR, etc.); how logic gates can be combined into circuits that can add, subtract, multiply, and divide (ALUs) and circuits that store information (RAM); and how these components can be combined into a simple computer with switches for inputs and light bulbs for outputs. Along the way, he also touches on Morse code, Braille, barcodes, machine language, x86 assembly language, high-level languages, ASCII, simple operating systems, and computer graphics, all within 400 pages.
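To make one step of that layering concrete, here's a minimal C sketch of the gates-to-arithmetic jump: AND, OR, and XOR gates wired into a full adder, and four full adders chained into a ripple-carry adder. This is my own illustration of the idea, not code from the book; the function names and the 4-bit width are just assumptions for the example.

```c
#include <stdio.h>

/* Basic gates operating on single bits (0 or 1). */
static int AND(int a, int b) { return a & b; }
static int OR(int a, int b)  { return a | b; }
static int XOR(int a, int b) { return a ^ b; }

/* A full adder: adds two bits plus a carry-in, producing a sum bit
   and a carry-out, built entirely from the gates above. */
static void full_adder(int a, int b, int carry_in, int *sum, int *carry_out)
{
    int half = XOR(a, b);
    *sum = XOR(half, carry_in);
    *carry_out = OR(AND(a, b), AND(half, carry_in));
}

/* Chain four full adders into a 4-bit ripple-carry adder: each
   stage's carry-out feeds the next stage's carry-in. */
static int add4(int a, int b)
{
    int sum = 0, carry = 0;
    for (int i = 0; i < 4; i++) {
        int s;
        full_adder((a >> i) & 1, (b >> i) & 1, carry, &s, &carry);
        sum |= s << i;
    }
    return sum; /* carry out of the top bit is discarded */
}

int main(void)
{
    printf("5 + 9 = %d\n", add4(5, 9)); /* prints 14 */
    return 0;
}
```

Nothing in there "knows" arithmetic; addition just falls out of the same three gate operations applied over and over, which is exactly the point Petzold keeps making at every layer.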
It's simply one of the best books on any subject that I have read in a while. And for me it was really helpful to start from the bottom and build up each layer of complexity. Modern computers so often just seem to work like magic, so it can be a little mind-blowing to see how everything they do results from incredibly simple operations being performed over and over again according to specific patterns and at incredible speeds.
This may have been easier to grasp with older computers, when you typed a command and then had time to think about what was going on inside the machine while it calculated the output. I remember my dad showing me how to write a simple BASIC program on his monochrome luggable IBM-compatible in the mid-80s. Those machines seem much simpler than the ones we use today, although even they were exceedingly complicated.
At any rate, gaining some understanding of how the complex behavior of computers emerges from layer upon layer of simple rules and instructions makes it easier to think about how the same thing can happen with neurons. (I also think it was helpful to read this book before trying to learn C.)