My experience with automation is that it's invaluable for regression, particularly the kinds of regression that are tedious and painstaking to perform. A login script is usually a utility that runs as part of a larger script suite - which must, as Siva said, be object-oriented and data-driven if you don't want to create a maintenance nightmare for yourself.
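To make the data-driven part concrete, here's a minimal sketch of the kind of reusable login utility I mean. I've used Python and Selenium purely for illustration - the locator IDs, URL and credentials file are placeholders, so substitute whatever your suite is actually built on.

```python
# Minimal sketch of a data-driven login utility (Python + Selenium).
# The locators, URL and credentials file layout are placeholders.
import csv

from selenium import webdriver
from selenium.webdriver.common.by import By


def load_credentials(path):
    """Read username/password pairs from a CSV so the test data lives outside the code."""
    with open(path, newline="") as f:
        return list(csv.DictReader(f))


def login(driver, base_url, username, password):
    """Reusable login step, called by any suite that needs an authenticated session."""
    driver.get(base_url + "/login")
    driver.find_element(By.ID, "username").send_keys(username)
    driver.find_element(By.ID, "password").send_keys(password)
    driver.find_element(By.ID, "login-button").click()


if __name__ == "__main__":
    driver = webdriver.Chrome()
    for creds in load_credentials("test_users.csv"):  # placeholder data file
        login(driver, "https://your-app.example", creds["username"], creds["password"])
    driver.quit()
```

Keeping the credentials in a data file rather than in the script is what lets the same login step be reused across environments and user roles without touching the code.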
Phil is 100% correct - you don't want to simply code your manual tests. Some types of test are better handled manually, while others make more sense to automate.
My suggestions are:
- look to the 80/20 rule - good automation targets involve the 20% of your application that's used by 80% of your users 80% of the time. At my last employer, the application had a massive amount of setup, most of it once-and-done, and a relatively small transaction component where almost all the activity occurred. Regression focused on the transactions.
- each component identifier should appear exactly once in your code, preferably as a defined constant or an object mapping. That way, when a component changes, you're only updating your automation in one place and you know exactly which place you're updating. (In my current position, I store the mapping in a database and have built a crude web app to edit it. There's a rough sketch of the idea after this list.)
- look at ROI - not so much $$$ as tester time and business value. If you've got a set of manual tests that take hours or days to complete, and a regression escaping would hurt your employer's reputation, they're a good target for automation. At my last employer, this was tax calculations in transactions. Bare-bones manual regression of that was 3-5 testers for a week of tedium with a high probability of errors. Once the automation was in place and the misery of validating the baselines was over, a more comprehensive set of tests ran every night (on three machines, for a total run-time somewhere over 24 hours), so anything affecting tax calculations was caught within a day of being introduced.
- keep an aggressive maintenance and refactoring program - In my experience, aggressively maintaining and improving your automation code is the only way to keep it from becoming a headache. At my last employer, some of the automation was over 10 years old and I was the only person there who understood how it worked, because maintenance was never enough of a priority.
- automation is code - You'd be surprised how many people don't get this. You see tools advertising code-free automation - what they mean is that they've done the coding and limited your scripting to the things they've coded. Even with these tools, you still need to keep your tests granular and reusable or you'll end up with a mess.
- do not rely on record/playback - Even the most advanced record/playback tools can't do the things well-constructed code can do.
- automate the tedious tasks - I can't stress this enough. If you're doing something boring multiple times over, it's a good candidate for automation.
- don't forget the utilities - Utility automation can get overlooked in favor of automated regression, but it's just as useful. Automating a lengthy application install process saves everyone time and effort. Automating the setup of standard data sets is another good one to consider. Automated database backup/restoration, bulk file copy operations, virtual machine setups and the like are huge time-savers. Automating the test reporting is another good target - instead of using the tool's result reporting, export the test logs to a central repository in a format that everyone can read, and feed them into a dashboard your managers can use. (There's a bare-bones sketch of that after this list, too.)
- if it repeats, automate it - with obvious exceptions like checking printed output (which is faster and more accurate with the Mark 1 eyeball), if a defect slips to production more than once, consider automating a check for it.
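To make the "exactly once" point about component identifiers concrete, here's a rough sketch of a central object map, again in Python/Selenium with invented names and locators. The same idea works just as well backed by a database table, which is what I'm doing at my current position.

```python
# Sketch of a central object map: every component identifier lives here and
# nowhere else, so a UI change means updating exactly one entry.
# The logical names and locators are invented for illustration.
from selenium.webdriver.common.by import By

OBJECT_MAP = {
    "login.username":    (By.ID, "username"),
    "login.password":    (By.ID, "password"),
    "login.submit":      (By.ID, "login-button"),
    "orders.new_button": (By.CSS_SELECTOR, "button.new-order"),
    "orders.tax_total":  (By.XPATH, "//td[@class='tax-total']"),
}


def find(driver, name):
    """Look a component up by logical name so test code never embeds raw locators."""
    by, locator = OBJECT_MAP[name]
    return driver.find_element(by, locator)
```

Test code then refers to "orders.tax_total" everywhere, and when the developers rename that element you change one line.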
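And on the reporting utility, this is roughly what I mean by exporting results to a central repository: a bare-bones sketch that appends one row per test to a shared CSV a dashboard can read. The field names, result structure and destination path are my own assumptions, not anything tied to a particular tool.

```python
# Bare-bones sketch of pushing test results to a central repository in a
# format anyone can open. Field names and the destination path are placeholders.
import csv
from datetime import datetime, timezone
from pathlib import Path


def export_results(results, repository_dir):
    """Append one row per test to a dated CSV in the shared results repository."""
    repo = Path(repository_dir)
    repo.mkdir(parents=True, exist_ok=True)
    out_file = repo / f"results_{datetime.now(timezone.utc):%Y%m%d}.csv"
    write_header = not out_file.exists()
    with out_file.open("a", newline="") as f:
        writer = csv.writer(f)
        if write_header:
            writer.writerow(["timestamp", "suite", "test", "status", "duration_s"])
        for r in results:
            writer.writerow([
                datetime.now(timezone.utc).isoformat(),
                r["suite"], r["test"], r["status"], r["duration_s"],
            ])


if __name__ == "__main__":
    # Illustrative placeholder result and path only.
    export_results(
        [{"suite": "tax", "test": "standard_rate", "status": "pass", "duration_s": 4.2}],
        "shared/qa-results",
    )
```

Once the results are sitting somewhere central in a plain format, feeding a dashboard or a manager's spreadsheet is trivial.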
Hopefully, these will give you some ideas to start with.