I asked myself this question straight out of college, when I was plugged into a testing role on my very first project. Fast forward about seven years, and I still find folks confusing the two. That's why I decided to try separating them based on what I've been exposed to.
You'll find various job postings, project roles, and position titles that use the two interchangeably. However, I believe there is a fundamental difference, especially at the level of manual testing. The following comparisons assume a scenario in which a Tester and a Software Quality Assurance Analyst are both handed test cases that have already been designed, with test steps fully developed.
I've come across instances where a Tester was brought on to do nothing more than execute manual test cases. In such a scenario, the Tester sticks to the scripted test steps and does not stray from the designated path. A Quality Assurance Analyst, by contrast, leverages that script as an opportunity to gain further insight into the expected functionality of the code or component in context. The QA Analyst references the technical specifications and functional requirements to ensure there are no discrepancies in the interpretation of the expected results, and also looks for potential off-script anomalies or inconsistencies. A Tester, as expected, does not deviate from the stated test plan; they fulfill their duty in the most straightforward manner, simply gauging whether each test step's actual result matches the expected result.
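The Tester's half of that contrast can be sketched in a few lines of code. This is only an illustration with hypothetical step descriptions and function names, not any real test framework: each scripted step is checked by a straight comparison of actual result against expected result, with no judgment applied beyond pass/fail.

```python
# Hypothetical scripted test case: (step description, expected result) pairs.
test_case = [
    ("Enter valid credentials and click Login", "Dashboard page is displayed"),
    ("Click the Profile link", "Profile page shows the logged-in user's details"),
]

def execute_scripted(test_case, actual_results):
    """The Tester's approach: compare each step's actual result
    to the scripted expected result, and record nothing else."""
    return [
        "PASS" if actual == expected else "FAIL"
        for (_, expected), actual in zip(test_case, actual_results)
    ]

results = execute_scripted(
    test_case,
    ["Dashboard page is displayed", "Profile page returns a 404 error"],
)
print(results)  # ['PASS', 'FAIL']
```

Everything the QA Analyst adds, cross-checking the expected results against the specifications and probing off-script, happens outside a loop like this one, which is precisely the point.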
A Tester is often rated on the throughput of their test case execution. By sticking to the script, the Tester can show progress on a weekly/iteration basis, depicted in graphs and reports of throughput achieved. The QA Analyst, meanwhile, may not hit their weekly/iteration throughput targets; instead, through thorough analysis and venturing off-script, they may have caught potential production issues. The ultimate judgment of quality comes post-production, where minimal impact and seamless implementations are the QA Analyst's targeted goal.
A Tester's focus: the number of test cases passed. The QA Analyst's goal: identifying defects.
According to the International Software Testing Qualifications Board (ISTQB): "If the tests are well designed and they pass then we can say that the overall level of risk in the product has been reduced. If any defects are found, rectified, and subsequently successfully tested then we can say that the quality of the software product has increased."
Your HR Department might tag you as a "Tester", the Account Manager of your client engagement may have staffed you as a "Tester", or your Project Manager may have you noted down as "Tester" on the Project Organization Chart, but for the sake of the product, put on your "QA Analyst" hat. Keep in mind that when you're in that QA Analyst mindset, be prepared to preach, "Hate me now; love me later." It may be challenging to quantify success in progress reports on a weekly/iteration basis, but don't let that near-sightedness cripple the foresight gained through qualitative progress. The success factor is having fewer issues to triage in production post-implementation.