I have been coding algorithms for 15 years now, and it's only in the last ~5 years that I've felt confident saying I understand how to analyse runtime complexity.
Along the way, I have done all sorts of things to try to get better:
Tried reading algorithms books (yes, I said tried, never completed one)
Tried understanding it from online tutorials
Tried revisiting the notes I took in my university algorithms course
None of them really helped me much, until one day I was trying to explain it to someone else and came up with this:
For the example we'll use this question: "On how many pages of Lord of the Rings does Gollum appear?"
for page in book.pages:
    for word in page.words:
        if word == "Gollum":
            appearances += 1
            break  # Gollum found on this page, move on to the next page

First thing to have in mind: the algorithm is not magic. You coded it, so you understand it; there's nothing the computer is magically doing, it's just running the instructions one after the other.
So what's the algorithm doing? For every page in the book, we take every word and compare it with "Gollum". If it matches, we increase the counter and move on to the next page.
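To make that concrete, here's a minimal, runnable version of the same loop as a Python function. The Page and Book classes are just hypothetical stand-ins for whatever structure your book data actually has:

import_free = True  # no imports needed; plain Python

class Page:
    def __init__(self, words):
        self.words = words

class Book:
    def __init__(self, pages):
        self.pages = pages

def count_gollum_pages(book):
    """Count the pages on which the word 'Gollum' appears."""
    appearances = 0
    for page in book.pages:
        for word in page.words:
            if word == "Gollum":
                appearances += 1
                break  # found on this page, skip the rest of it
    return appearances

book = Book([
    Page(["my", "precious", "Gollum"]),
    Page(["the", "shire"]),
    Page(["Gollum", "sneaks", "Gollum"]),  # still counts as one page
])
print(count_gollum_pages(book))  # prints 2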
So the worst case is that we need to read all the words of all the pages once (Gollum doesn't appear at all, or only appears as the last word of a page).
So we can conclude that the time it takes for our algorithm to complete depends directly on the number of words in the book (the number of pages is irrelevant!).
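A quick way to convince yourself of this is to count comparisons instead of measuring time. This sketch (using plain lists of lists in place of the hypothetical Book/Page classes above) puts the same 1000 words on 1 page and then on 100 pages; in the worst case the number of comparisons is identical either way:

def count_comparisons(pages):
    """Run the Gollum search on a list of pages (each a list of words)
    and count how many word comparisons it performs."""
    comparisons = 0
    for page in pages:
        for word in page:
            comparisons += 1
            if word == "Gollum":
                break
    return comparisons

words = ["hobbit"] * 1000  # worst case: Gollum never appears
print(count_comparisons([words]))  # 1 page of 1000 words   -> 1000
print(count_comparisons([words[i:i + 10] for i in range(0, 1000, 10)]))  # 100 pages of 10 words -> 1000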
At this point you already understand how your algorithm behaves! Now you could go one step further and express this result using standard mathematical notation, known as Big O (or Theta) notation.
Assuming it takes us a constant time (k) to read a word (regardless of its length) and our book has N words, we can say that the time to read the book is T = k * N. And since for a huge book (large N) the constant k is irrelevant compared to N, we simply drop it, so the Theta notation for our algorithm is Θ(N).
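Written out as a quick worked equation (same symbols as above: k for the constant per-word time, N for the number of words, T for total time), you can see why k drops out; doubling N doubles the time no matter what k is:

T(N) = k \cdot N, \qquad \frac{T(2N)}{T(N)} = \frac{2kN}{kN} = 2, \qquad \text{so} \quad T(N) \in \Theta(N)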