Yes. Hardware & software matter. They absolutely matter. They tell you several things.
- What limitations the developers for that system will have. (Ex: The reason you still get loading screens when entering small 1-2 room buildings in games like Skyrim is that modern consoles don't have enough RAM to handle it.)
- What sort of DRM and anti-piracy measures might inhibit your ability to customize the device to your liking. (Ex: When my family owned a PS3, we had several issues getting it to play files from our media server, because the DRM built into the box fought with everything that wasn't a .mp3 file. It would sometimes fail to even read file information correctly, so you could open an album folder and get zero information on the music files beyond their file names. Sometimes, not even file names.)
- What limitations the multiplayer will be put under. (If the system has unreliable multiplayer software, it will slow everything down.)
- What OS the system is running, which helps determine how much of a pain in the ass cross-platform connectivity will be. If it's a new OS, software developers will have to learn it. (That's why operating systems nowadays tend to steal features from each other--it's easier to code across platforms when they all share the same features.)
- Whether or not backwards compatibility will ever come back. (It won't in any sort of practical capacity. Because greed.)
- How much money the company selling you the box is shaving off in personal profits.
- How many more ways corporations will attempt to steal and sell your personal information for various ends. (If you think Microsoft smartened up after its creepy always-on built-in webcam shit got shouted down, you're a fool. Both Sony & Microsoft constantly monitor your network activity. For... "Stability reasons." Cough.)
- How many limitations the system has that will require peripherals. (Ever wondered why Nintendo boxes tended to be so cheap and minimal? Because they would sell you a dozen peripherals that got them their money back.) Fun fact: this is the same reason new consoles don't come with features built in that they've been touting for generations now, like motion controls and webcams. At least they stopped with the rumble pack nonsense.
For example: the PS3 came out boasting a humongous amount of graphical power, but it had an absolutely fucking terrible amount of RAM. That's why most games on it have tons of loading screens. The PS4 didn't do much in the way of upgrading graphical fidelity, but it has a ton of RAM. And now games like No Man's Sky are touting procedural generation, and gameplay videos show no loading screens... Hmm... HMM... Couldn't be tied to hardware, no siree.
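To make the RAM point concrete, here's a toy sketch in plain Python (this is not anybody's actual engine code, and the numbers are made up) of why more memory means fewer loading screens: with a bigger chunk budget you can keep the world around the player resident and stream the rest in quietly, instead of dumping everything and throwing up a loading screen.

```python
# Toy illustration only -- not how any real engine or console works.
# Idea: the bigger your RAM budget, the more world "chunks" you can keep
# resident around the player, so you can stream instead of hard-loading.

from collections import OrderedDict

CHUNK_SIZE_MB = 64  # made-up size of one streamed world chunk


class ChunkStreamer:
    def __init__(self, ram_budget_mb):
        self.max_chunks = ram_budget_mb // CHUNK_SIZE_MB
        self.resident = OrderedDict()  # chunk coord -> fake chunk data, kept in LRU order

    def needed_chunks(self, player_chunk, radius=1):
        # Every chunk in a square around the player has to be in memory.
        px, py = player_chunk
        return [(px + dx, py + dy)
                for dx in range(-radius, radius + 1)
                for dy in range(-radius, radius + 1)]

    def move_player(self, player_chunk):
        needed = self.needed_chunks(player_chunk)
        if len(needed) > self.max_chunks:
            # Not enough memory to hold even the player's surroundings:
            # dump everything and block on a full reload -- a loading screen.
            self.resident.clear()
            for c in needed[: self.max_chunks]:
                self.resident[c] = f"data{c}"
            return "LOADING SCREEN"
        # Otherwise stream: load whatever is missing, evict the oldest chunks.
        for c in needed:
            if c in self.resident:
                self.resident.move_to_end(c)  # mark as recently used
            else:
                self.resident[c] = f"data{c}"
        while len(self.resident) > self.max_chunks:
            self.resident.popitem(last=False)  # evict least recently used
        return "seamless"


print(ChunkStreamer(ram_budget_mb=256).move_player((0, 0)))   # tiny budget -> LOADING SCREEN
print(ChunkStreamer(ram_budget_mb=2048).move_player((0, 0)))  # big budget -> seamless
```

Same game logic, same player movement; the only thing that changed between the two runs is the memory budget. That's the whole argument in miniature.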
Now, gameplay is always what matters. So, keeping that in mind, the more power any system has (and the better rounded that power is), the more options developers have in choosing and designing SDKs to add features you couldn't get on previous-generation hardware. I love X-COM: Apocalypse, but I'm not going to pretend the graphics aged well, and the limitations of its interface were a direct result of the hardware of the time. The newer X-COMs, simplified though they might be, play a lot smoother and have a more rationally designed interface. I might enjoy old first person shooters like DOOM, but I'm not going to pretend the graphical fidelity of Metro 2033 is unwelcome. Hell, the atmosphere that game projects onto the player is only possible because of the graphical fidelity it has available to create that atmosphere.
Now, do all games need millions of dollars of graphics? No. Aesthetic is what people remember, and aesthetic design is what makes something nice on the eyes even if it's built in DOS. BUT! Better hardware & software mean the developers have more power and more options to make what they want to make rather than what they're forced to make due to limitations. Especially software. The PS3 was apparently a massive pain in the ass to design games for because of its unorthodox software architecture, and frankly, having had to use the damn thing myself, yes: it is a software nightmare.
EDIT
As for gameplay vs. story: it depends on the game and the genre, obviously. Arguing that one is objectively better than the other is foolish, and arguing over personal tastes even more so. It's why we have so many genres; everybody enjoys their own thing.