Yes, it has only one background layer available. The effect is achieved by scrolling different parts of the image at different speeds, but they're not really separate layers and therefore can't overlap.
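For anyone curious how that single-layer trick works in practice, here's a minimal Python sketch of the idea. All the band boundaries, speeds, and line counts below are made-up illustrative values, not taken from real hardware or any specific game:

```python
# Sketch of single-layer "parallax" via horizontal band splits: one
# background layer, but each horizontal band of the screen gets its own
# scroll offset every frame. On real hardware this would be done by
# rewriting the scroll register during horizontal blanking at each
# band's first scanline.

SCREEN_HEIGHT = 224  # assumed active line count, purely illustrative

# (top_scanline, scroll_speed_relative_to_camera) for each band
BANDS = [
    (0,   0.25),  # distant sky scrolls slowly
    (96,  0.5),   # mid-ground hills
    (160, 1.0),   # foreground scrolls at full camera speed
]

def band_offsets(camera_x):
    """Per-band horizontal scroll offsets for a given camera position."""
    return [int(camera_x * speed) for _top, speed in BANDS]

def scroll_for_line(line, camera_x):
    """Scroll offset in effect on a given scanline (the value a raster
    interrupt would write into the scroll register at that line)."""
    offsets = band_offsets(camera_x)
    current = offsets[0]
    for (top, _speed), off in zip(BANDS, offsets):
        if line >= top:
            current = off
    return current
```

Since every band reads from the same layer, two bands can never be drawn on top of each other, which is exactly why the effect can't produce real overlap without borrowing sprites.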
@@yasminesteinbauer8565 Yeah, and they've obviously used some spare sprites at certain points to create the illusion of some parts of the background overlapping with others in some of the levels, just here and there, which is a neat effect and solution. Very cool.
I'm surprised that this thing only had an 8-bit CPU; it's a true testament to its 16-bit graphics subsystem. Reminds me of going from an S3 ViRGE accelerator to my first 3dfx card - the microprocessor stayed the same, but the framerates went through the roof. Truly epic. I'm 50 now, and saw games grow up from the Atari 2600 days. I had a TG16 back in the 90s, which shipped with Keith Courage in Alpha Zone. I bet if it had shipped with a Bonk game, the TG16 would have actually had a chance to compete with the likes of Sonic and Mario.
3D acceleration was a totally different beast and did take most of the load off the CPU. Wasn't quite like that in the 4th console gen. The CPU still did a ton of work back then, from aspects of the graphics such as animating the sprites and tiles, to physics and audio. It's just that the term "bits" was a silly way to judge system performance even then.
I wonder if Konami has the SuperGrafx prototype peripheral: the Power Console, the unreleased cockpit-sized controller that attaches onto the SuperGrafx unit.
I always wondered why they didn't release a true 16-bit CPU to compete with the Sega Genesis in the United States. You know they probably had one and decided not to use it. Instead of making all these different versions of the PC Engine, they could have upgraded the CPU; they could have used the Motorola 68000, same as the Sega Genesis.
Cost is usually a factor - perhaps the CPU was cheaper, and NEC reckoned the GPU subsystem was sufficient to deliver "16-bit quality" visuals, but with a reduced main processor. I think the PC Engine was perhaps envisioned to one-up the NES, not take on the Genesis, hence the design choices they made?
@@AL82RetrogamingLongplays They knew the Sega Genesis was coming; you would think they would have changed the components to match the Genesis instead of adding turbo switches to the controller. That's their upgrade lol.
Because putting a 68k or any other 16-bit CPU in the PC Engine would do nothing for it. It literally has the speed of the 68k (the HuC6280 might be 8-bit, but it's incredibly fast, easily matching the original 68k at the same clock speed). I'm not sure what you think a 16-bit CPU is going to do. Literally all the heavy lifting in these 4th-gen consoles comes from the graphics chips, not the CPU. The CPU has nothing to do with flicker/sprite drop-out, parallax, colors, etc. Famicom developers cut their teeth on the lesser 6502 in the Famicom, so coming over to the PCE with its more capable 65x-compatible CPU enabled a quick development ramp-up and the use of familiar tools. It seems silly, but that was an advantage early in the PCE's life. The processor was also a cost reduction, because they fabbed the CPU themselves - it's not a stock processor, and it has additional block-transfer instructions (DMA), a built-in MMU for a 2-megabyte address range, and more instructions besides. If they had gone with a 68k, they would have had to build that tech into additional support chips (which is what Sega had to do). That's an advantage of custom over stock parts. Nintendo did this with the 6502-compatible processor in the Famicom, and again with the Super Famicom - neither system uses a stock CPU, but rather custom ones with additional features built in.
@@orlandoturbo6431 The PCE was designed with a FULL upgrade path via the expansion bus. That expansion bus is more advanced than that of any console before it, and after it. And the SuperGrafx was the answer to the Mega Drive. It was more powerful than the Mega Drive, but clearly wasn't needed. They could have made the SGX an add-on, and it would have been pretty simple (nothing like the 32X) thanks to that expansion bus. Even with the advantages of the MD over the PCE, the PCE overshadowed the MD in Japan. NEC didn't care about the market outside of Japan, so the Japanese market dictated the software and hardware related to the PCE.
The M68000 was a pricey beast, powering most of the era's expensive computers and arcade boards. Sega got a hell of a bulk deal on them and still had to cut other things from the hardware to cover the high cost. The TG16 was a bit underpowered in the CPU department, making it struggle with things like less-than-basic physics calculations (like complex hit detection), which could have been helped by throwing a cheap co-processor like the Z80 into it (to offload the audio and some other work). But the TG16 hardware's biggest issue was its graphics chip, which could only output a single scrolling background layer. By that time, multi-layer backgrounds and the cool depth effects they enabled had become a staple of arcade games. A lot of TG16 games tended to look "flat" in comparison to those arcade games... or to the Genesis and SNES ones. Developers came up with ways to simulate these effects using software tricks, with varying degrees of success (just check out the scrolling in Ninja Gaiden for the TG16!), but that stuff took time and resources to implement. This game obviously has excellent layered background depth effects, which are definitely the "star of the show" in this otherwise pretty average shmup.
Well, that’s the aspect ratio the PCE delivers at this resolution. The problem here is the emulation in combination with modern screens, which produce truly square pixels, versus the horizontally rectangular pixels of a CRT.
Perfect explanation - the nature of LCD and emulation means it's a choice between square pixels or correct aspect ratio. 4:3 aspect ratio produces pixels that aren't square, which look worse.
@@AL82RetrogamingLongplays I am fully aware of the technical background. Therefore, it would be best to use a superior upscaling algorithm rather than simple nearest neighbor. Pixels back then were not only non-square; the electron beam wasn't even rectangular, but round and blurry. Anyway, the way the game is displayed here is not how the developers intended it to look, and dithering doesn't work either. However, I would prefer even nearest neighbor with the correct aspect ratio.
Lanczos, bicubic and other upscale filters produce a smeary effect that, in my opinion, is worse in visual appearance than nearest neighbour; it's like watching the video through baking parchment. I've also experimented with various CRT filters and scanlines, but the problem with those is that they require an increased screen size (over 1080p) to really see the effect, and they produce a nasty screen-door effect for people watching non-maximized videos, or watching on a mobile device with a smaller screen. As for the aspect ratio/nearest neighbour question, non-square pixels are more of a distraction to my eye, hence why the videos are presented the way they are.
@@AL82RetrogamingLongplays I don't know what options you have, but something like GIMP's LoHalo (sigmoidized elliptical weighted averaging with Robidoux bicubic for upsampling, blended with non-sigmoidized EWA Robidoux for downsampling) gives quite good results. (instagram.com/p/CO1q0j1nk7U/ ) And if you want to stay with nearest neighbor, you only have to choose an integer multiple of the pixels. So, for example, one pixel equals 8 pixels horizontally and 7 pixels vertically at the highest resolution you upload. And if the video is displayed at smaller resolutions, the downsampling works as oversampling and should still look good. But I agree about scanlines. That's not a good idea if different resolutions are used for viewing.
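The 8x7 nearest-neighbor suggestion boils down to simple arithmetic; here's a tiny Python sketch of it. The 256x224 source resolution is just an assumed example, and the 8:7 factors come from the comment above:

```python
# Integer scaling: blow up each source pixel to a whole sx-by-sy block,
# so nearest-neighbor stays perfectly crisp while approximating a CRT's
# non-square pixel aspect ratio.

def integer_scale(src_w, src_h, sx, sy):
    """Output resolution after scaling each pixel to an sx-by-sy block."""
    return src_w * sx, src_h * sy

def display_aspect(src_w, src_h, sx, sy):
    """Resulting overall display aspect ratio (width / height)."""
    w, h = integer_scale(src_w, src_h, sx, sy)
    return w / h

# e.g. an assumed 256x224 frame scaled 8x horizontally, 7x vertically
w, h = integer_scale(256, 224, 8, 7)  # -> (2048, 1568)
```

Scaling 256x224 by 8x7 gives 2048x1568, whose overall aspect (about 1.31) lands close to, though not exactly at, 4:3; the point is that every source pixel maps to a whole block of output pixels, so nothing shimmers or blurs.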
OK, seriously, I despise when the player doesn't stick with a power-up and just keeps changing weapon type even when it makes sense not to. Less than 3 minutes in, have to walk away from this.