This is getting really popular. I saw a regular 21" consumer Trinitron on Tokyo's Craigslist for 5000 yen I guess just because it had RGB and YPbPr. It won't be long before the rising value of CRTs combined with the rising quality of consumer HDTVs will make it largely a matter of nostalgia - but we're not there yet! Get your pro monitors while you can!
Is there any saving a dropped CRT?
Colors are all out of whack.
I turned it on its side to see how it would react. It fixed the colors for a few minutes but as it sat the colors slowly drifted and soon it was messed up again.
So the monitor that Carmack is using in that one picture and the Sony FW900 (and its rebrands) are the only CRT displays to support 1080p resolutions, right? I know pretty much all CRT HDTVs only go up to 1080i.
No, there are many CRT monitors that support 1080p and higher. My Samsung SyncMaster 955b, for example, can go as high as 1920x1440, my Tulip monitor even higher, and both in progressive scan.
There's a whole bunch of high-end CRTs that support higher-than-1080p resolutions, such as the non-F W900. Nokia also made a few high-res monitors.
Even slightly higher-end 21-inch or larger CRTs from the late '90s to early 2000s support 1920x1440 or higher.
Most computer monitors from the early 2000s support higher resolutions than 1080p.
LCDs were a huge step back in terms of resolution. Monitor resolution has been stagnant because of the rise of HDTV.
All high-end 21-inch Samsungs from the early 2000s
All high-end 21-inch Nokias from the early 2000s
All high-end 21-inch Sonys from the early 2000s
All high-end 21-inch NECs from the early 2000s
You do realize that's a lot of models right? And that's not even a quarter of the manufacturers who made high-res CRT monitors.
H-here I go again, this guy >>1788902 from previous thread
I got some help from /g/ and this might very well be a lost cause, but on the off chance that someone has any idea what I'm doing wrong I'll post it in these threads for the last time
TL;DR: I'm trying to hook up a Macintosh Color Display to my main PC through DVI using DA-15 to VGA and VGA to DVI adapters. The screen keeps scrolling upward like the frames on a roll of film; supposedly this is caused by an incorrect refresh rate, but even setting the right refresh rate (66.7Hz) and resolution (640x480) as a custom resolution in the nVidia control panel still gives the scrolling. Strangely enough it works perfectly on my G3 Mac. I've tried every refresh rate from 60Hz to 85Hz to no avail. I found pic related on the back of the monitor while looking for a vertical hold knob or similar, but I don't know what those are for and was advised not to mess with them.
Any help appreciated. Unless I figure it out I'll just hook it up to the Mac; I'd just rather use the PC for convenience and because it has more disk space.
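For anyone fighting the same thing, here's a rough sanity check of the timing that monitor expects. The figures below are the commonly cited Apple 640x480 numbers (30.24 MHz dot clock, 864x525 total); that's an assumption on my part, not something off this monitor's service sheet:

```python
# Rough check of the classic Apple 13"/14" display timing:
# 640x480 active inside an 864x525 total raster.
pixel_clock_hz = 30_240_000      # 30.24 MHz dot clock (assumed)
h_total = 864                    # 640 active columns + blanking
v_total = 525                    # 480 active lines + blanking

h_freq_khz = pixel_clock_hz / h_total / 1000
v_freq_hz = pixel_clock_hz / (h_total * v_total)

print(f"horizontal: {h_freq_khz:.1f} kHz")  # 35.0 kHz
print(f"vertical:   {v_freq_hz:.2f} Hz")    # 66.67 Hz
```

If the custom resolution you define doesn't land near that 35 kHz / 66.67 Hz pair, a fixed-frequency monitor like this one will roll exactly as described.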
It's a sync issue, I can tell you that much.
Out of curiosity, what kind of sync does the Mac monitor eat? Because if it expects composite sync and it's getting separate horizontal/vertical or something, that might cause such issues.
the picture shows POTs (potentiometers, used for adjustments)
left is probably h-position
middle i'm not sure
right is skew
shouldn't be a need to touch those
when the picture looks like it's in the right place, does it jolt a little as if it's trying to keep a still picture, or does it scroll smoothly, at a constant rate?
Yes but what are the horizontal scan rates on them? Would any tube ever possibly do 4K res, or even 4Ki?
If the monitor works on a G3 Mac (I'm assuming you used its native god-forsaken cable plug) then it can receive a correct signal. The issue must lie somewhere in your double-adapter conversion setup. You know it's never good to convert more times than you have to, right? Who knows how badly implemented the converters you're using are, or which of them is the problem.
>what are the horizontal scan rates on them
depends on the monitor. the most common is 85hz for 1920x1440. though some only support 60hz at 1920x1440. and some do higher.
>Would any tube ever possibly do 4K res
none that are widespread afaik.
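To put rough numbers on it (a back-of-envelope sketch assuming ~5% vertical blanking, which is a rule of thumb rather than any spec):

```python
# Horizontal scan rate needed for a mode: total scanlines per second
# = (active lines + blanking) * refresh rate.
def h_scan_khz(active_lines, refresh_hz, blanking=0.05):
    total_lines = active_lines * (1 + blanking)
    return total_lines * refresh_hz / 1000

print(f"1920x1440 @ 60 Hz: {h_scan_khz(1440, 60):.0f} kHz")   # ~91 kHz
print(f"1920x1440 @ 85 Hz: {h_scan_khz(1440, 85):.0f} kHz")   # ~129 kHz
print(f"3840x2160 @ 60 Hz: {h_scan_khz(2160, 60):.0f} kHz")   # ~136 kHz
```

Afaik even the best tubes top out somewhere around the 120-140 kHz horizontal range, which is why progressive 4K on a CRT stays borderline at best.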
yeah they're really poorly drawn so they probably don't even make sense hehe. Thanks for clearing it up though
depending on the hz setting the screen is either too far to the left or right, but it stays in position and doesn't jolt. It usually scrolls at the same pace from what I've seen, but at the "best" setting (77hz) it scrolls really slowly and at least once stopped, albeit "between frames" like it was a roll of film, though it was still flickering a lot. I've also seen it increase in rolling speed from slow in that setting
yeah, I worried about that, but at the same time I have to use DVI since the mobo's VGA is disabled while my GPU is running. I suppose I could look and see if I have another adapter or something
I have no idea sorry. Is there any way to check this?
Would love to get some Mitsubishi stuffs.
It's just that my apartment is hilariously tiny.
>I've also seen it increase in rolling speed
from how it sounds, this line especially, the monitor might not be getting a vsync signal at all
try cleaning out your adapters, screw them in if they have screws, wiggle them a bit to see if the image gets better/worse momentarily
try other adapters/cables if you can
I just think that with all the pin mismatches between the two converters, a critical signal pin gets dropped somewhere along the way by converter manufacturers that don't know how to do it right. Either way, latency must suck if the converters ever resort to digital reprocessing.
Not really /vr/, but you guys are the only place I know to ask this.
I bought a big-ass 4:3 Trinitron a while back, and I was wondering about the 16:9 enhanced setting. Can someone explain what this really does, and should I be using it for games that would play letterboxed regardless of what screen I have?
CRTs support a broader range of resolutions than LCDs with their fixed native resolutions. Once you realize that, it's more common than you'd think.
In any case: I'm leaving this here.
I had a 16:10 Samsung HT monitor I gave to my brother and I got a 1080p one instead. Reason: Better horizontal FOV in the FPS I played. Good to know when a spy is coming in towards you from the side. K, that's it for non-/vr/ness.
1080p requires 1920 horizontal pixels. 1600x1200 is close in total pixel count, but it still can't display 1080p correctly.
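The arithmetic, for anyone counting:

```python
# A 1080p frame needs 1920 columns; 1600x1200 falls short horizontally
# even though the total pixel counts are in the same ballpark.
print(1600 * 1200)    # 1920000 total pixels
print(1920 * 1080)    # 2073600 total pixels
print(1600 >= 1920)   # False: can't map 1080p columns 1:1
```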
But how many of those 4:3 monitors can accept a 1920x1080 signal and automatically display it in 16:9? 16:9 native mode content should really be viewed on 16x9 displays. You guys probably run everything through your OS but not everyone does.
This was my reasoning for 16:9 for years.
Yeah, sure, 16:10 gives a higher resolution. But the in-game perspective doesn't draw a taller image; it simply takes away from the sides. I want to see as much as possible, so 16:9 it is.
there's usually ways around it, widescreen gaming forums is a good place to find methods of getting custom resolutions out of games
and 1280x1024 is pretty close to using as much space as is available at 5:4 if all else fails
>But how many of those 4:3 monitors can accept a 1920x1080 signal and automatically display it in 16:9?
You are not making sense. We are talking about computer monitors where the graphics adapter sets the resolution.
>claims 1920x1440 is a common resolution
>posts an image that doesn't list 1920x1440
I'm well aware that LCDs and many other panel displays have fixed native resolutions, and that anything else gets resampled, but that has nothing to do with CRTs displaying the extremely standardized resolution known as 1080p.
And even then a lot of TVs and monitors have features to allow the user to display the source to-scale or to 'pan-and-scan' or whatnot. Depends on the manufacturer and if the tube is feature dense.
the crt pc monitor i used for years normally only went up to 1280x1024 (only much later did i mess with things to get different resolutions above that)
to be honest i've never been annoyed by any differences between 4:3 and 5:4, i've always considered them to be close enough to each other, at least for gaming purposes
also i don't think i've actually gotten stuck with using 1280x1024 on my current 1920x1080 monitor, there's always been some way to get the game to work at native res, sometimes not without stretching some UI elements, but that usually doesn't matter much in 3D games
Well, I doubt you'll see PAL in use outside of Europe. Those extra scan lines are better suited to what the CRTs over there were initially capable of. Yet with any high-resolution-capable CRT you can fit a PAL signal at a 1:1 scale within the screen without digital conversion, whether 16:9, 4:3, or otherwise.
CRTs draw each horizontal line as a continuous analog signal, which means you could in principle fit arbitrarily many pixels into each line. In practice the driving circuitry only supports so much bandwidth, and the actual pixel shape is determined more by signal timing than by any digital limitation.
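To put a rough number on that bandwidth limit (the figures below are illustrative assumptions, not specs for any particular monitor):

```python
# How analog video bandwidth bounds horizontal detail: at best one full
# signal cycle can represent an on/off pixel pair, so usable pixels per
# line is roughly 2 * bandwidth * active line time.
def max_h_pixels(bandwidth_mhz, h_scan_khz, active_fraction=0.8):
    line_time_us = 1000 / h_scan_khz           # total time per scanline, us
    active_us = line_time_us * active_fraction # time spent drawing picture
    return int(2 * bandwidth_mhz * active_us)

# e.g. a hypothetical 200 MHz video amplifier at a 106 kHz scan rate
print(max_h_pixels(200, 106))   # ~3000 distinguishable "pixels" per line
```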
>any midrange monitor from the early 2000s can display 1920x1080 in 16:9
I have issues with this answer, and expect many anons attempting to connect their PS3s to such monitors based on that answer would as well.
Also, technically you can get a 4:3 monitor to display a 4:3 resolution using the entire screen without data loss, and you can even get PCs to output this signal and any other 'custom' resolutions you want, both within Windows and within graphics drivers' control panels.
As for WHY the industry always marketed even capable CRT TVs as merely standards-compliant (without letting on that the tube could do more): their inputs probably didn't accept a signal carrying those higher resolutions, whereas a monitor input would. Generally, the higher the resolution, the noisier the signal gets, so you could only push analog VGA to so high a display resolution without quality loss, even if the CRT supported more. And since 1080p is a standard, all consumer-oriented hardware markets toward that deadlocked target rather than trying to be better or different. Besides, there are compatibility issues when you deviate from habit, because people don't like change.
normally yes, though interlaced modes aren't impossible
interlacing was just a kind of trick to get more effective lines of resolution out of limited bandwidth, something pc monitors were designed to have more of from the beginning
intentionally using interlaced modes could be handy for viewing interlaced videos though, provided you lack the resources to do reasonable software deinterlacing
CAD, 3D modelling, Image processing, Video editing, Industrial control panels which require a fuckton of space on the screen, Medical Imaging. And anything else which might require a bit more screen space or sharper image.
also, if you are playing interlaced videos, make sure you use a mode with the exact same vertical resolution as the video
interlaced video won't display correctly unless the fields line up and are drawn in the right order
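A toy illustration of why the field order matters (a pure sketch, not any real deinterlacer):

```python
# "Weave" two fields back into one frame: the top field supplies the
# even scanlines, the bottom field the odd ones. Swap the fields and
# every pair of lines lands in the wrong order (visible combing).
def weave(top_field, bottom_field):
    frame = []
    for even, odd in zip(top_field, bottom_field):
        frame.extend([even, odd])
    return frame

top = ["line0", "line2", "line4"]
bottom = ["line1", "line3", "line5"]
print(weave(top, bottom))     # correct: line0..line5 in order
print(weave(bottom, top))     # swapped fields: pairs come out reversed
```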
I think it means that all of the vertical resolution is compressed to display in 16:9, so you get a better picture as opposed to a TV that wastes scan lines drawing the space above and below the letterbox.
My understanding is that it'll only help you if you have a video signal that is actually in 16:9, so if you have a game that is 4:3 letterboxed, you'll get a squished picture.
TVs I guess are fine, since they're 15kHz. But for computer monitors there was no reason to go LCD, since we already had higher resolutions and refresh rates. By going thin we limited ourselves to one resolution at 60Hz?
i don't really think thinness is the only thing
lcd's are nicer for static content where contrast/color accuracy isn't as important (unless you have an expensive as shit lcd with better support for those):
their geometry is naturally perfect, they're sharper, and they don't flicker at all (if you can do 85-100hz+ at high res on a crt, then that's nice as well, but no lcd flickers)
i'd rather browse the net all day on an lcd than a crt
just like i'd rather play games on a crt
they each have ups and downs
I have a 1080p@120Hz LCD monitor with 1-2ms response time, and good color reproduction. Still would like a nice CRT monitor and TV for retro purposes though (and second screen goodness).
Looking into research on SEDs pretty much answers that (nanoscale electron emitters). Of course that probably wouldn't change the necessity of large depth for electron magnetic guidance if you're doing a classic CRT display style.
i know some lcds can approach crt's advantages, but that support doesn't come cheap (and i'm too poor for that, so i just use a bit of both)
that said, crt's still do low res stuff better
for retro purposes i'd get a crt tele, pc monitors have a smaller dot pitch and won't do less than ~400 lines without doublescan, which makes them less suitable for low res games (depending on preferences, as well)
And seeing how the glass on a CRT has to be thick to hold the vacuum, I doubt you'd get a thin profile even with SEDs. There's also the fact that CRTs build up static electricity; if you were to put a CRT in a mobile phone you'd have a lot of problems dealing with that static (afaik).
I have a watchman in the house. I could pull up a timestamped pic, but tbh it looks more like there's a projector at the bottom of it. You sure it's CRT? I don't want to risk opening mine up.
P.S. my model's like the one on the left. Would take a pic but I'll have to wait until I get back from doctor's later today.
This is not illegal? Thought you had to go through the FCC to broadcast anything in that range anymore. At the very least I might jam any future techs around my area (I live in a city, not bumblefuck nowhere).
Feel free to post about it as I get ready to leave and I'll come back later to read. :)
Yeah. It basically incorporates a tuner to get the desired channel and a demodulator to extract video and audio from the channel.
The video circuits should be simple compared to a color TV as it doesn't need to process chroma to get color.
So, adding video and audio input might be simple in theory. The only obstacles are the space, how to mount the connectors and how to (safely) connect the inputs to the circuits.
>Feel free to post about it as I get ready to leave and I'll come back later to read. :)
well, i mean anything that has RF out, that is, can output video/audio through the same connector used for the antenna
most older consoles had RF out either integrated or with an adapter
the tv might have an antenna-in port. i used to have a pocket tv (a little newer, and lcd) which had its antenna port in the form of a 3.5mm jack
just a matter of hooking that up to a consoles' RF out wire (or straight onto the antenna if there's no such port)
pic related, playstation rf adapter
on a partially related note, i used to have a 14" tele in my room when i was younger, which only had an antenna port. i modified a playstation rf adapter so i could hook up other consoles to it (the adapter actually just takes 5v and composite video/analog audio and converts it to RF, any composite source will work)
I just got this back into my flat; I bought my 32" LCD because I get a pink upper-left corner when playing games with a blue background. Already tried to fix it with a strong drill (as a makeshift degaussing coil), didn't work.
This thing won't be here long, but is it also a piece of shit? SEG Cortina-S II
>paying for something people throw away on a regular basis
Sorry about the rotated image. Apparently just because it's upright on the phone doesn't mean it is while being stored. Hate that.
Is it Watchman time? Pulled mine out of the closet, and can no longer find its power cable. Oh well, time to go find a replacement.
Why on Earth does an '85 model have more features than my '92 version?
Forgot link: https://www.youtube.com/watch?v=7wokgD0MiKU
<- Some more details too.
Why the hell does it keep rotating my photos to landscape when I took them in portrait, moved the photo to my PC, checked that it was portrait, and even rotated it manually 4 times to make sure it stayed portrait?
Can't seem to change it either (yes, I have a Windows 8 Metro app for EXIF, shut up).
Hey /vr/, pulled out my little 5 inch B/W CRT. Gotta say, kinda neat that I can still actually see what I'm doing.
4chan removes exif data anyways.
iirc it actually re-does the entire god damn image. in case you haven't noticed, colors sometimes fuck up because of that on images uploaded to 4chan.
That would explain a lot of failed checksums I'm getting using 4chan Grab (when downloading images) and missing results on Foolz (when searching for old images I've uploaded using the checksum of a more recent upload).
>FOR WHAT PURPOOOOOOOOSSSSSE
Not retro but covered under the CRT topic.
I can't seem to get my PS2 using external sync through component.
I've heard people say it is sync on green.
Does this mean I plug green into the sync on my monitor? Or do I have to split the green sending one to green and the other to sync?
Internal sync is causing a bit of shaking in the picture.
Tried wiggling the converters around, and tried changing the DVI converter (VGA to DVI-A this time; the one before was DVI-I), but still no luck. Taking a closer look at the converters, however, I found something strange: the DA-15 to VGA converter is missing pin 9, see pic. According to Wikipedia it's "KEY/PWR formerly key, now +5V DC". I'd assume that's what's causing the problem, but then why does it work fine with the G3?
I'm thinking maybe the pin has gotten stuck in the VGA port of the G3's GPU. Then again, I'm not sure that would actually work and I'm probably being ridiculous. Either way I'm thinking of taking the GPU (ATI Rage 128) out of the G3 and installing it in my main PC, but then my question is: will it work? Can I use my 560 Ti for my LCD screen while at the same time running the Rage 128 for the Mac screen? My mobo has 2 PCI slots aside from the PCI-E, so it would most likely fit.
Thanks for all the help
I know, I know, but since the VGA converter works with the G3 I'm guessing it's the VGA to DVI converter that's at fault
So you're saying that when you connected the DA-15 monitor to the G3, you also used the DA-15 to VGA converter as well? I imply this because you said the pin was missing on the CONVERTER, and not the MONITOR'S CABLE.
Also, DVI-I handles both DVI-D and DVI-A, so not too surprising why DVI-A didn't work, but good effort, and at least we know it's not the DVI converter at fault.
If you're really going to plug the G3's video card into your PC, make sure that your PC supports PCI, as most new motherboards just don't anymore.
So if I get RF working and can get a VGA->RF converter and all that, I'm interested in testing outputting from my PC to the monochrome display.
Also, I'm sure >>1794636 would benefit from this as well if he decides to connect to PC for emu purposes.
The thing is that I don't see support for monochrome displays in neither Windows nor my driver's control panel. Do I have to physically be connected to an analog source first, for those options to show up? At the very least I should be able to switch color mode to something below 32bpp even for my digital LCD (like for compatibility modes).
B/W CRT guy here. Mine actually takes RGB, so I can just hook up my consoles directly. For curiosity sake, how would I hook it up to my PC through RGB?
I mean I guess I can always set up a VM with a retro OS and allocate it the analog display and let it manage it, but I'm fairly certain with the right software/know-how you can get various display modes working in modern OSes. At least in Linux you definitely can.
Get a VGA/DVI-to-RGB converter, I would suspect. Anything that preserves the data on the pins is okay. The only problem you might have is your PC wanting to output a progressive mode, which doesn't work well above 240p at NTSC refresh rates (USA) on 15kHz horizontal-scan-rate monitors (ordinary CRT televisions). So you could always switch the resolution mode to 'interlaced' in your equivalent of the control panel. Make sure you look up info on your CRT to see the maximum resolution it supports. Apparently most TVs support higher resolutions than consoles or broadcast TV allow.
Also, you sure that's RGB? Looks like monaural composite to me.
Another tip for mono sound: switch the output balance in the Windows volume control panel to only the left (or right) speaker, depending on which one you use. Though you'd probably rather have your own PC speakers connected to the PC directly than use the dinky mono speaker on the TV.
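To put numbers on the interlaced-mode advice above (NTSC-ish figures; 525 total lines per frame):

```python
# Why 480i fits a 15 kHz TV but 480p doesn't: interlacing draws only
# half the frame's lines per pass, so the scanline rate stays low.
total_lines, frame_rate = 525, 30000 / 1001   # NTSC: 525 lines, ~29.97 fps

interlaced_khz = total_lines * frame_rate / 1000
progressive_khz = total_lines * frame_rate * 2 / 1000

print(f"480i: {interlaced_khz:.2f} kHz")    # ~15.73 kHz, fits a TV
print(f"480p: {progressive_khz:.2f} kHz")   # ~31.47 kHz, needs a monitor
```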
Hey guys, my Grundig CUC5360 has a bit too much red in the image.
Does anyone know how I could fix that?
Also, the image looks marvelous (pure RGB) but is off-centered.
I've read that there is a "service code" you can input with the remote to move the image around and change the R/G/B balance.
Does anyone know it?
Does anyone know how I could find it?
Googling has given me zero results.
Grundigs have been made mostly in Germany. I guess a german-speaking bro could help me?
yep that's right.
I really just wanted to try a different converter, regardless of whether it was DVI-I or DVI-A, just to see if the one before was damaged or something
the mobo has 2 PCI ports but I'm not doing it; I put the card in and the fucker blocked 3/4 of the 560 Ti's fan, fuck that
>actually takes RGB
all of my wat
I think that what you see is luminance (the very same as used in S-Video), but I bet a lowpass is applied to that input to get rid of the chrominance, since you can connect plain composite without getting a moire pattern over the screen.
>For curiosity sake, how would I hook it up to my PC through RGB?
You have to convert H- and V-sync into csync, then add the specifically weighted mix of R, G, and B. This results in luminance.
Or the simple way: you don't. Just use the composite/S-Video output. It's not as if it would matter very much, as this >>1794636 doesn't look all that great.
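For the curious, that mixing is just a weighted sum. A sketch using the standard Rec. 601 luma weights, which are what composite/S-Video use; whether this particular monitor expects exactly these is an assumption:

```python
# Mono (luma) signal as a weighted mix of R, G and B, levels 0..1.
def luma(r, g, b):
    return 0.299 * r + 0.587 * g + 0.114 * b

print(luma(1.0, 1.0, 1.0))   # white -> ~1.0
print(luma(1.0, 0.0, 0.0))   # pure red -> 0.299, a mid-dark grey
print(luma(0.0, 1.0, 0.0))   # pure green -> 0.587, noticeably brighter
```

The uneven weights are why saturated colors come out at different grey levels on a mono tube.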
Yeah that's right. Older video cards actually had component/S-Video and other outputs besides just VGA/DVI/HDMI/etc. So obviously if your PC is old enough to have that then go for that. My computer doesn't though. The only analog output it has is VGA, which thankfully isn't deprecated entirely yet.
So my old Trinitron XBR from the 80s has this weird thing where when it's turned on for the first time in a few hours it'll have the colors go all wonky - see the attached picture of the (normally white) Wii dashboard.
As you can see it's almost like the colors have become inverted. Anyway, it's not really a big deal because giving it a slap on the side fixes it, and it goes away by itself after being turned on for five minutes or so. I'd just sort of like to know what's causing it (though I'd imagine it might have something to do with the TV being nearly three decades old).
Man, I hate Apple products. They can't even stick to a consistent standard across a few years. Based IBM PC Compatible support.
What's your budget like? Can't you just abandon that old Mac monitor and get something decent without a fuss?
I wouldn't slap it like that; it's not good for it
I think the phenomenon you're seeing is related to degaussing, but it shouldn't last that long. Sounds maybe like something went wrong with the coil? I dunno.
I have no idea. If slapping it on the side fixes it, then some component's probably loose inside due to weak solder joints. If it's really from the '80s you should consider replacing most of the electronic components with newer ones with better materials.
I hear ya man, and the computer this monitor originally went with didn't even have DA-15; it had to use a DB-25 converter. Fucking Apple.
Absolutely. I don't know how much a good old CRT monitor would cost, but I'm pretty sure I could pay it; I'm just a cheap bastard and want to save every penny I can. I'd like it if it could connect to both my main PC and the G3, but I'm guessing that won't be hard.
Google tells to me that this is an oldschool Grundig from the early 90s, also not total garbage.
>Also, the image looks marvelous (pure RGB) but is off-centered.
That's fucking normal for most SCART TVs. Either adjust the TV to this and get an off-centered composite picture as a side effect,
or make (or possibly buy) an RGB shifter which can adjust the centering externally.
>I've read that there is a "service code" that you can input with the remote and move the image around and change the R/G/B balance.
I own 3 Trinitrons.
My nice KV-C2521D (1991) as my main TV, and a KV-14M00D (1993) stored in the attic. Neither has a service menu; adjusting them requires disassembly and twisting the trimmer pots.
And a KV-M1450D (1995) also in the attic. That one has a service menu and a non-automatic 16:9 mode.
So I think you can adjust the red drive on the neck board (pic related, should be the same as yours).
One of the trimmers might be what you're looking for, and I'm sure they're appropriately labeled on the other side.
>Grundigs have been made mostly in Germany. I guess a german-speaking bro could help me?
Very true in the 80s and earlier. And yes, I'm trying to help you right now.
I have two old pentiums in a closet. I don't have a good crt monitor yet and my wife would kill me with how much space I already take up in our apt.
I have a wii on component for emulating some stuff. I've been mostly sticking to the stuff I actually own recently though.
Nothing in that pic cost me over $40. Just keep your eye on craigslist and you'll find stuff cheap.
Something similar to this for the former (can't vouch for quality buys though). But if you also wanted to share keyboard and mouse you should probably look into KVMs. I have one or two sitting around in the house.
also KVM seems pretty cool, might have to check that out as well
guess since I'm here anyway, what would you say I should expect to pay for an old CRT monitor? what's too expensive and when's it cheap?
Thanks for the help
>Google tells to me that this is an oldschool Grundig from the early 90s, also not total garbage.
Yes. It's the only RGB-capable 29" TV I have ever seen. I bought it last year because of this.
>That's fucking normal for most SCART TVs.
> Either adjust the TV to this and get a off centered composite as side effect.
This I'd like to do, more so because I own all NTSC systems and PAL composite doesn't do shit for me.
>Or make (possibly buy) a RGB shifter which can adjust the center externally.
Make could be, buy would not.
>Both don't have a service menu, adjusting requires disassembly and twisting the trimmer pots.
Ah, crap. I should man up and disassemble my TV someday, then.
Depends highly on build quality and rarity; none of them are made anymore, but luckily they're still in good supply in thrift stores and whatnot. My suggestion is to go to your local thrift stores to get a general idea of market price, and then try to find specific models online. Generally you should know what you want from a CRT: ones with high phosphor quality are the best at color reproduction, which is something that intrigues me personally (as does resolution support).
You could probably find an old CRT monitor in the trash.
Pricing depends entirely on what you're looking for and where you're at. If it's an FW900, then expect to pay quite a bit.
I'd say don't pay more than 50 dollars or whatever your equivalent in local coins is.
And don't pay anything if it's not in mint condition.
>I imply this
You inferred it.
>Also, DVI-I handles both DVI-D and DVI-A, so not too surprising why DVI-A didn't work, but good effort
DVI-A is what one should expect to work, and DVI-D is what one should expect not to work.
>Damn it why does it always flip them?
Because moot made another retarded and wholly unnecessary change live on the production site without testing changes on some beta site. Again. As is normal for this unprofessional website that I love so much.
>Man, I hate Apple products. They can't even stick to a consistent standard across a few years.
DA-15 lasted for over a decade and really is pin-compatible with VGA when it comes to using VGA monitors on Apple computers. The other way around, they justifiably had no reason to care.
Pointless playing games that weren't meant for monochrome on a monochrome monitor.
If you just want everything green then you're better off buying one of those Philips monitors with a button that turns everything green. On a somewhat related note, what were those originally for?
A wild KX-14CP1 appears!
Just bought my third Official PS1 SCART lead. Help, they're so pretty.
Not really, though the chips are kinda shitty. I've yet to find one (sans maybe a DMS4 S.E Pro) that works as I desire. Discs have to be patched as Master Discs too, but it will boot media from all regions without issue.
I picked up 4 of these for £30 each, sold one to a friend for cost price and the other two for £60 to community members.
I'm moving into my first house soon, and I'm going to have my own game room. Got all these to work with, plus an old floor model that I'm not sure works.
Wicked pumped to put all these tvs to use again.
>I've yet to find one that works as I desire
Not sure what you are looking for, but I think Matrix Infinity might be the best choice considering that it can play imports and convert their signals (or even speed up PAL games to work as PAL60, force VGA SoG output etc.), adds native support to non-Sony memory card (those 16/32/64/128 MB) and it has no problems with FMCB whatsoever
If only it didn't have those garish splash screens. The perk of buying all 4 machines is that I got to keep the NTSC-U machine (EU PSU - DTL-H30101 E) over the other 3 PAL machines, so NTSC PS1 software works correctly, which isn't the case on the PAL ones. Perfect for my usage. Otherwise I don't really care about those other functions, although I know some people do.
I'm looking for a small CRT, like, 13"-16"ish, does anyone have any good recommendations as far as brand or models?
The only ones I've seen around locally are Orion, and I heard those are not very good.
Yeah, Trinitron. I have two 13" KVMs: one's from 1987 and the other is a Wega from 2004. They're both sweet, but not as sweet as my 14" PVM, though it's a lot bigger. The Wega is pretty deep though.
Hey peeps. I've been lurking /vr/ for a while, and finally saw a decent CRT on the side of the road that I could plug my SNES, N64 and Wii into for some solid retro time.
It's a Panasonic TX-68PS20A. I cleaned all the connections with cotton tips and it works fine for the old red, yellow, and white, but the picture for component for the Wii is all mixed up. Any tips for trying to fix it?
I've got the Wii running on RCA now at 50Hz, which gives the correct screen orientation in 4:3, but its flicker is giving me a headache. Is it worth persevering to get a component CRT? You've brainwashed me into wanting a Sony Trinitron.
So far Link's Awakening DX running off the Homebrew Channel looks and sounds amazing compared to LED, but the flicker is killing me.
TL;DR: any easy way to fix the component input? Is it really worth it for retro when emulating from the Wii?
Thanks! Pic somewhat related.
>the picture for component for the wii is all mixed up
Picture/video pls. Perhaps a proper description too.
Your post isn't quite clear whether the flickering is composite only or also with component.
>If you just want everything green then you're better off buying one of those Philips monitors with a button that turns everything green
Color CRTs will still have an aperture grille or shadow mask, even if you turn off two of the beams. Mono CRTs are completely continuous - the screen is a single even coat of phosphor with nothing interrupting it. They are incredibly clear and sharp at high resolution; if you've never seen one, go remedy that immediately.
>On a somewhat related note, what were those originally for?
Switching the beams on/off individually was to help calibrate the color decoding circuitry. If you had two "reference" colors in a signal, and you knew that they should have the same amount of (for example) green when decoded, you could then adjust the phase of the color decoder until that was true on screen.
Interesting development. It was blurry on component because it was set to 480p on the Wii (I think). This is what component looks like at 576i and 480i set on the Wii (pic related - colors are all wrong even though everything is plugged in correctly). Any suggestions?
Composite works fine for everything.
Two pics incoming. These are the output options for the wii. Component works for the first two but the colors are messed up.
This is what component looks like when set to edtv/HDTV 480p. Sorry I'm on a crappy ipad.
Pretty much all PVMs have composite, though they're usually BNC connectors. That's easily remedied with an RCA-to-BNC adapter. I got 10 of the things for about $11 off of Mouser, and I know RadioShacks in the US always have them, though they usually charge like $5 for just one.
I wanted to make sure I got good quality adapters so I bought mine through Mouser.
Since Mouser didn't have any male to male SCART cables I did take a gamble on the Acoustic Research SCART cables on Amazon and they turned out to be fucking awesome. With my powered Bandridge SCART switch, I should be able to play games on my PVM and record video once I get video capture hardware.
Aussie so yeah, probably the same thing. Even playing with the video settings in the emulators causes the same result when I set it to 480p.
The wii works fine using component on my widescreen led, so I think the component jacks on this tv I picked up might be screwed?
So close to joining the emulation master race.
>The wii works fine using component on my widescreen led
The LCD TV probably accepts both RGB and component on those jacks. The CRT only accepts component.
The black hearts in the screenshot are a giveaway. They're being output in RGB, so the green and blue intensities are near zero. This means that the TV is getting near zero on the luma signal (green plug) and making it nearly black.
Set the Wii to output component, not RGB.
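To see why the green-wire-as-luma mixup goes dark, here's a quick sketch using the standard Rec. 601 luma weights (the weights are a published standard; the "red sprite" numbers are just my example):

```python
# Rec. 601 luma from RGB (all values 0.0-1.0).
def luma(r, g, b):
    return 0.299 * r + 0.587 * g + 0.114 * b

# A bright red sprite in RGB is R=1, G=0, B=0. A TV expecting YPbPr reads
# the green wire as Y (luma), i.e. 0.0 -> near black. The luma the TV
# *should* have received for pure red is:
print(round(luma(1.0, 0.0, 0.0), 3))  # 0.299
```

So anything without much green in it collapses toward black when RGB is fed into a YPbPr input, which matches the black hearts in the screenshot.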
This cable. Don't think it's an official Nintendo one from memory but it has worked okay with the LED.
There should be a "No Signal" message displayed in the bottom left of the monitor.
I know there's a way to keep those notifications on all the time but if you aren't seeing that then maybe there's a way to turn it off all the way.
Check the menus.
Yeah, it's normal. I also freaked the fuck out when I first turned my PVM 1444QM on, but I saw the guy who sold it to me test it so that calmed me down a bit.
If anyone has the service manual for it, though, that'd be much appreciated.
Would a third-party component cable be that much different from a first-party one?
It's for my PS2 - sorry for the non-/vr/ question, but this is the most relevant place to ask.
Thanks for suggesting them! I was able to confirm that it was the first SCART cable I had and not the switch that was causing interference. Once I find the right TV stand and a second SCART switch I'll have to set up my rolling blackout.
Some of them are. I bought this 3rd party PS2/PS3 component cable from a local game shop and it works great for both systems. I am curious about getting a hold of component cables with composite out on them and testing RGB over it.
PS2 generic component cables are pretty okay. I have a big fat multi-cable that has composite, component and s-video for SNES/GC/Wii/Xbox/PS* and it works great as long as I only have one console connected to it at a time and only have one of the outputs plugged in.
I've got the exact same one. It gives a good image without any visible interference. A bit anecdotal, but the first pair I bought had the plastic covering on the system side come apart before I even got it plugged into the system. Wrapped the replacement with electrical tape just to be safe.
For Trinitrons, as long as it's a fully analog set you'll be good. If you end up getting a digital Trinitron you'll need some sort of expensive converter, unless you're using a PS2 or Dreamcast, in which case I STRONGLY suggest you use S-Video: performance is optimal, though the picture will be a tiny, tiny bit fuzzy.
And, to cover PVMs, there's a lot of variation in their support. 14" and smaller ones will often be composite only, and there are even very common monochrome security-camera models - I'm sure you don't want to play your games in black and white. You'll also want to consider how high you're going to be taking your inputs. All PVMs are outstanding, even with composite, though some composite signals are shittier than others (SNES). You'll probably find the best combination of price and features around the 20" size, and those are still small enough to be delivered by normal FedEx or UPS instead of freight. I'd go as far as saying MOST PVM-20s have composite, S-Video, component and RGB, while 26"+ ones almost always lack component (YPbPr). Also, the 5-BNC connections on smaller monitors are easier to work with than the DB connectors the bigger ones have, and a lot of people say the quality is lower on the big ones, though that may just be perception since imperfections are more visible at larger sizes.
>14" and smaller ones will often be composite only
Mine has CMPTR (analog RGB), VTR (digital RGB), and S-video in addition to composite.
Do PVMs with only composite input really exist? Sounds weird as heck. Not very "pro" at all.
I'm picking this one up in a week or two, it's a 13".
I'll probably be using RCA cables; the oldest system I have is either an atari 2006 that I got pretty much brand new looking (except for the old box) or the SNES.
Should I stay away from large TVs, say 28"-36"? Because I found a 32 inch and a 36 inch Trinitron that are cheaper than some smaller ones on Craigslist.
Pretty sure Atari was only doing Flashbacks by 2006.
captcha for this post: crtspol jaiosdf
You got a source on VTR being RGB? It's not that I don't believe you, Anon, but all I can find about it is the pinout for the cable, with no mention of RGB at all - only that the 8 pins are video in, video out, audio in, audio out and 4 grounds.
I think I'm guilty of adding to the confusion on this one.
When marked VTR, it's just analog video. But the jack is also a common digital RGB connector. So anyone with a retro computer background is automatically going to assume that's what it is.
Sorry about that!
hey guys, i'm looking for a good widescreen crt to play quake 3 on
do any of you know any good specific models for this? I know about the FW900 but i'm looking for something cheaper
So I have a graphics card with a S-video, VGA and DVI outputs. Which output would be best for a CRT in an emulator setup?
S-video is obviously the easiest since it's just one cable with no conversion, but is there any downside to say a VGA-to-RGBSCART set-up? Is there any reason to even consider the DVI?
You might be able to use drivers to get the VGA port to output 15kHz RGB and just use the appropriate cable to hook it up to an RGB monitor. Hooking straight up to a VGA CRT monitor is also a good option.
>You might be able to use drivers to get the VGA port to output 15khz RGB
I've found something called "soft-15khz". People with my card seem to have positive experiences.
>just use the appropriate cable to hook it up to an RGB monitor.
Basically something like this?
>what card is it?
Nvidia 8500 GT
those cables are not what you want, they're for different things
those are wired for scart *out* to vga *in*
it's for connecting scart gear to certain kinds of projectors
that said, there are suitable cables floating around, look for ones made for "arcade" use
it's all good, just remember that connecting a pc to scart isn't a common thing to do, so you won't find cheap manufactured cables, only custom made and specialty ones
this is somewhat common around home arcade/mame enthusiasts though, as tv's via rgb make a good alternative to arcade monitors
I have my sources. eBay and also community forums. I've started buying them when I see them though...
So what kind of horizontal scan rate does that monitor support? If Carmack is running his computer at 1080p@85Hz, then it should be at a minimum of 85*1080=91.8 kHz. This is even more impressive because of the widescreen aspect ratio. What's the maximum horizontal scan rate ever manufactured for CRTs on common aspect ratios?
And that's probably not even its maximum.
There are CRT monitors that go well above 100kHz, however. The FW900, for example, easily puts out 1920x1200@95p. The only problem with these beasts is that the shipping costs are usually in the hundreds-of-dollars range.
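For anyone wanting to sanity-check those numbers: the required horizontal scan rate is just total lines (active plus vertical blanking) times refresh rate. A rough sketch - the ~5% blanking overhead is my assumption for a typical CRT timing, not a spec value:

```python
# Horizontal scan rate needed for a given active line count and refresh rate.
# v_blank_factor is an assumed ~5% overhead for vertical blanking lines.
def h_scan_khz(active_lines, refresh_hz, v_blank_factor=1.05):
    total_lines = active_lines * v_blank_factor
    return total_lines * refresh_hz / 1000.0

# The bare 85*1080 figure from the post (no blanking):
print(round(h_scan_khz(1080, 85, 1.0), 1))  # 91.8 kHz
# With blanking lines included it's a bit higher:
print(round(h_scan_khz(1080, 85), 1))       # 96.4 kHz
```

So a monitor needs a fair bit more than the naive 91.8 kHz to actually display 1080 active lines at 85 Hz.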
Hey guys, I've just come into possession of this tv. I don't know what to make of it. The back has a lot of nice inputs, but it's a DLP so I have no idea whether it would be decent for old games. I'm going to hook it up and try out my NES on it when I get a chance in a few hours. That might decide it. It's widescreen so some of the screen is a waste. I have no room for it.
I'm strongly considering just selling it but....it is 42 inches.....
I'll go ahead and include a stock front image right now. It's pretty dusty and shoved in a corner, otherwise I would take a pic of the unit I have.
Hey /crt/. I got a PVM recently and at times there's some pretty heavy streaking via composite. I'm talking if there's a text box, there's a slightly darker streak moving out from where the text is across the width of the screen, or if there's something bright on a dark background there's a visible streak of color, etc. I don't have the cables to try anything else right now so I'm just wondering, is this possibly an effect of the composite cables or is it definitely a sign the tube is on the way out?
As it seems that the old Dell in the attic doesn't want to cooperate, would it be possible to get soft15khz or any of the other similar options working with the Intel 950GMA inside an r60 Thinkpad? Linux would be fine if it would mean I could get a proper 15khz signal; I'm not picky about OSs.
The real bummer is I won't be able to use my old ATi 9600.
Most likely it's going to be as badly suited for retro gaming as modern HDTVs. It's to do with analog-to-digital conversion and upscaling and all that wonderful malarkey.
To be sure, try playing Duck Hunt on it. That game becomes impossible to play on new HDTVs due to the previously mentioned latency issues.
I've been trying to find a CRT projector for a long time. No luck though, as there were very few consumer-oriented models. I don't even know where I'd look for commercial surplus or whatever.
I've got a Dell Mini 9 running Linux and I can drive a 15KHz monitor just fine.
This is where I've found a ton of shit on CRT projectors. Dude also sells some old ones, but he only ships on pallets and it's $200+ for shipping.
I know it's not retro (enough) but you guys are the only ones that would know. I can't ask /v/ or anything.
I can't decide whether to hook my Gamecube and PS2 up to the CRT or the HDTV. I've got component out for both and I've been comparing. Things look a little jagged on the S-video, but nice and bright, while they look smooth but kinda...I don't even know, dull, on the HDTV
You have a slight advantage on an HDTV with Gamecube because a good amount of its games support 480p (I think you need to hold a button combo on the GCN to activate it but if you're using a Wii set to 480p it's automatic), but so few PS2 games support anything higher than 480i that it's not worth it. Bear in mind even with 480p you're still upscaling it by a rather large amount to your TV's native resolution.
If your SDTV supports component I would say that's your best bet, but since you said S-Video I'm going to assume that's the best your CRT can offer. Even still, I'd say the CRT is better off since that's what the consoles were made for, though with 6th gen consoles it's kind of a wash. The unfortunate thing about the 6th generation is that this is when developers started using mostly interlaced resolutions for their games (sans the Dreamcast, which was native 480p for everything), and anything 480i is ugly on every TV. Honestly, your best bet for both GCN and PS2 is emulation at high (and, most importantly, progressive) resolutions.
Well that's promising.
Any pointers or links on figuring out how to go about doing so? I don't have much experience with Linux.
Also contemplating on whether I want to tear the VGA cable off of an old broken NEC CRT I have in the attic, and use that as the basis.
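If you do go the soft15khz/Linux route, the modelines are the intimidating part. Here's a quick way to sanity-check one before feeding it to xrandr - the timing numbers below are from a common 320x240 "arcade" modeline I'm using purely as an example, not something specific to your hardware:

```python
# Sanity-check a modeline's scan rates before using it.
# Example totals from a typical 320x240 "arcade" modeline (assumed values):
dotclock_mhz = 6.547   # pixel clock
h_total = 416          # 320 active pixels + horizontal blanking
v_total = 262          # 240 active lines + vertical blanking

h_khz = dotclock_mhz * 1000 / h_total   # horizontal scan rate
v_hz = h_khz * 1000 / v_total           # vertical refresh rate

# A 15kHz TV wants roughly 15.7 kHz / 60 Hz:
print(round(h_khz, 2), round(v_hz, 2))  # 15.74 60.07
```

If the horizontal figure comes out near 31 kHz instead, the TV will either roll or stay blank, so it's worth doing this arithmetic before plugging anything in.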
>your best bet for both GCN and PS2 is emulation
I'm looking for information on why my CRT monitor is making a high pitch sound and what I can do about but I'm not turning up much on how I can fix this. I'm willing to take apart my monitor and repair what ever needs to be fixed to get rid of the noise.
>The change in pitch can be from thermo-mechanical changes within the enclosure - say, a loose transformer gets looser as the cabinet heats up. Try whacking the cabinet :-) Sometimes a bad cable connection - even an external one - can actually result in noise.
Giving it a good slap actually made the noise go away for a few moments and made it quiet down shortly after. Looks like something is loose then, maybe I should re-solder some shit inside of it.
My main crt that I use for older games has a pincushion issue (the middle of the screen is stretched outward more the brighter the contents of the screen). I can hear a faint chirping when it isn't turned on, so I think a capacitor might've died. Is there any hope for it, or should I start looking for a replacement?
the late model trinitrons rebadged by dell, ibm and hp etc can do 130kHz horizontal, 170Hz vert, 230MHz bandwidth.
i used to have 2 of them on my desk for a few years
widescreen crt uses pretty much the same scan rates as 4:3, so scan rates are not a problem, but you need more bandwidth in order to cram in more pixels per line - if you don't have the quality analog parts to back it up (vid card/cable etc) then you're gonna get horizontal blurring
i think somewhere around 30% of the bandwidth is consumed (wasted) on blanking and sync alone, so that takes a huge toll on your quality when you start dealing with very high numbers on your crt.
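That blanking overhead is easy to see if you work out a rough pixel clock. A sketch - the 1.30 overhead factor is my assumption standing in for the ~30% figure above; real timings vary:

```python
# Rough pixel clock (video bandwidth) estimate. blank_overhead models the
# ~30% of time spent on blanking/sync (an assumed figure, varies by timing).
def pixel_clock_mhz(h_active, v_active, refresh_hz, blank_overhead=1.30):
    return h_active * v_active * refresh_hz * blank_overhead / 1e6

# 1920x1200@95 (the FW900 mode mentioned earlier in the thread):
print(round(pixel_clock_mhz(1920, 1200, 95)))  # 285 MHz of bandwidth needed
```

Which is why cheap cables and RAMDACs fall apart at these modes: nearly 300 MHz of clean analog signal is a tall order.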
What is the newest CRT you guys have ever seen?
I have one from December 2007.
the S-Video was a typo, I think I was trying to type SDTV.
I have a Panasonic Tau for my CRT, it has component input. All the 4th and 5th gen consoles are hooked to it via S-Video. The gamecube (tried Viewtiful Joe) looked a little jagged on component on the SDTV but blurry on the HDTV. I think it was only running at 480i though.
how old are PS2 and Gamecube gonna have to get before someone makes clone consoles designed to properly upscale them to high resolution digital output
hell, how long til we get that for the fucking Wii
I think he means consoles that will render video in HD like you can do with some emulators.
Also, Framemeisters are overpriced as fuck for only being able to upscale and make fake looking scanlines. I'll stick with my nice Yamaha AV Receiver that I got for cheaper that can upscale 480i to look good as it can on a 1080p LCD, has multiple HDMI, Component, S-Video, and Composite inputs, and lets me also have 5.1 surround sound.
>I think he means consoles that will render video in HD like you can do with some emulators.
Because clone consoles, including those piece of shit Retrons, are just big overpriced Android boxes running your carts through a jury-rigged emulator. N64 and onwards are too hard to emulate, ergo it'll be decades before it's viable to make affordable clones for them - at which point no one will give enough of a shit to do so.
Also, the main focus of those XRGB scalers is 240p content; specifically, upscaling it at the lowest possible latency. For 480i and deinterlacing, sure, you can certainly get better results elsewhere. I personally use a Mini for multiplayer stuff, and enjoy singleplayer games on my 14" BVM.
fw900 had a couple of rebadged versions, i think HP made one at least.
they were professional monitors; they weren't intended for the average pc user, given that they cost more than the average pc at the time.
i'm not immediately aware of any other wide desktop crts at all. you have to understand that in the era just prior to lcds everywhere, the typical desktop crt was 17 inch, or 15 if you were on a budget. 19/21 inch crts were expensive, so a big wide tube would be even more so.
Which of these two monitors is better?
A PVM-14M2E or a BVM-14 M4DE? I've found some guy selling them for €60 each.
I've found this document online, but I'm still not sure which to choose. http://www.pmotions.com/Public/PDFs/SONY/Displ_00.pdf (pic related)
Ah, okay. The SDTV was indeed playing it at 480i, as that's the highest resolution that kind of TV can go - hence the "SD". There are CRTs that can go up to 480p and above, but that's a whole can of worms right there, never mind finding one.
At this point you're better off just doing what -you- think looks best. Personally, I think 480i on a CRT with component looks way better than 480p upscaled to 1080p on an HDTV (and especially 480i upscaled to 1080p). Like I said, ideally your best solution for playing 6th generation games on an HDTV is emulation, since you can natively run these games at high resolutions and they play pretty well as long as you have the hardware for it. As far as real hardware goes, though, I would stick with playing them on a CRT. Especially the PS2.
BVMs are usually better, but what you gotta ask yourself is
A) Does it have YPbPr?
B) Does it have RGB?
C) Do I agree with the price?
D) Can it do 480p?
Try to answer yes to at least two of those.
The BVM is quite superior in resolution, but it lacks manual degaussing, which comes in handy when some magnetic field has fucked up your colors and you want to make them normal instantly.
You need to let it cool down for 15-25 minutes to trigger the automatic degaussing; this applies to color TVs in general.
The other thing you have to ask yourself is what kind of tube you want, as the resolution and the phosphors are the biggest differences.
Tnx, next question:
They both have YPbPr input, but being from Europe, most of my consoles are SCART. I know that there are adapters for RGB SCART to BNC and that I need one of those. But does it need a sync cleaner or not? In other words: this cable http://www.retrogamingcables.co.uk/sony-pvm-scart-converter-bnc.html or this one http://www.retrogamingcables.co.uk/female-rgb-break-out-scart-to-4-x-bnc--2-x-rca-for-sony-pvm-monitors.html
The degaussing is one of the points I'm doubting on. I already have a PVM 2130QM which is a bit fucked up because I put a speaker too close to it, and I don't know how to degauss it.
>which is a bit fucked up because I put a speaker too close to it and I don't know how to degauss it.
Then you might have permanently broken the tube (aperture grille bent) if the degaussing coil can't get it back to normal.
Do you hear a kinda high pitched noise when you cold start it?
And not if you turn it on right after being off for a few minutes?
Might be a possibility that the degaussing circuit is just faulty.
It seemed to fix itself. I didn't really pay a lot of attention to it.
I think the aperture grille is bent, as you can see in the picture (the top is a bit dented and the image is a bit shifted to the left). I know this isn't a pretty photo but it shows the shift. There is nothing I can do about that, right?
>the top is a bit dented
Looks quite normal to me. But I've only dealt with consumer TVs.
>bit shifted to the left
Absolutely normal for SCART RGB. I made myself an external RGB shifter (displaces the picture horizontally), as I use composite for the NES, so adjusting the TV would also adjust composite/S-Video sources.
But seriously, the purpose of the aperture grille/shadow mask is to 'mask' the electron beams so that they strike the right colors. If they aren't spot on then you will see wrong colors at some spots.
What you complained about is the geometry, which might be adjustable in some manner (service menu, trimmer pots or dials on the outside); these issues don't have much to do with the bare internals of the tube.
However do you only use RGB?
Then try to adjust H Cent, as your pic looks quite nasty with the black bar at the right.
As for that little dent at the top, I'm pretty sure you can't correct it without opening the set, because the permanent magnets between the tube and the yoke may be misplaced.
But I could be wrong about that.
Not really. Most flat-screened CRTs are of larger size, and many have slimmer depths. Larger CRTs on average have more geometry problems than smaller screens, and the ultra-slim sets, where they place the electron guns as close to the screen as possible, also have geometry problems. Flat screens have a reputation for geometry problems, but it's not because they're flat.
tl;dr: flat vs curved screens doesn't matter. Size and depth do.
PVMs are shielded precisely so you can stack them with speakers or other monitors and not have distortion. Did you have its steel case off of it? Also, so far we haven't found a comprehensive guide to which PVMs accept composite sync and which don't, so unfortunately you just have to find out for yourself. You should be able to use a PlayStation or Gamecube plus US-style YPbPr component cables that have composite too, and some RCA to BNC couplers. I don't know how common YPbPr component cables are there, though.
That dent was worse, that also seems to be fixing itself for some reason.
>However do you only use RGB?
Most of them are, only a Xbox and Wii use component (that is why I was looking for a extra monitor)
>Then try to adjust H Cent
For that I have to open up the monitor and fiddle with some screws, and I have no idea what to do. So I'd most likely end up electrocuting myself.
It is one like in the picture, I think you mean that by steel case?
I fucked it up by moving the speaker in front of the monitor. And I'm using old Technics speakers that my dad used for his lp player. Tho I'm not sure if that makes any difference.
> US style YPbPr component cables that have composite too
I think that YPbPr cables are the same here, not sure tho. I also bought some RCA to BNC couplers from eBay back when I got the 2030 (which I am not using at the moment).
>That dent was worse, that also seems to be fixing itself for some reason.
Yeah, think about it. These monitors were intended for long stretches of use, so it was likely calibrated while warm (on for >1h), meaning it would be correct in the end.
>For that I have to open up the monitor and fiddle with some screws and I have no idea what to do. So I'd most likely end up electrocuting myself.
Well if you want to gather some balls then look up 'CRT safety measures' on google and look at the service manual, it's not really that complicated.
First place the CRT on clean table, place a (big) mirror in a way that you can see the entire screen while you standing behind the CRT.
Just open the thing and carefully adjust the trimmer while looking at a mirror.
That's how I would do it, and I did once touch the connection points of the deflection coils (around 60VAC, I guess) while twisting the convergence rings on my TV.
My finger was just numb for 5 minutes, and it was rather painless compared to the heatsink of a shitty LCD TV that got me an hour after a well-experienced coworker was shocked by it (I didn't see what he'd touched). Don't ask me why I did it. I simply didn't expect any kind of voltage on that piece of metal.
i've always found flat crts blur more around the edges, which i imagine is because the screen isn't tangent with the electron beam there
isn't that the point of having a curved screen in the first place? to ensure the beam hits a consistently 'flat' area?
could also be due to loss of focus around the edges, since a flat screen means the center and edges are slightly different distances from the gun
>PVMs are shielded precisely so you can stack them with speakers or other monitors and not have distortion.
I'm sure they'd play fine with newer shielded speakers, but I have to position my left KG4 just right to not cause discoloration on the 20M2MDU.
Even with a curved screen, the distance between the gun and the center of the screen vs the edges is different. I think it has everything to do with the angle the beam hits the screen at.
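The throw-distance difference is easy to estimate with a little Pythagoras. The dimensions below are made-up but plausible for a roughly 21" flat tube:

```python
import math

# Assumed geometry: gun sits 40 cm behind the screen center of a flat
# 4:3 tube with a 40x30 cm visible area (all numbers illustrative).
gun_to_center = 40.0
half_width, half_height = 20.0, 15.0

# Beam path to the corner is the 3D diagonal:
to_corner = math.sqrt(gun_to_center**2 + half_width**2 + half_height**2)
print(round(to_corner - gun_to_center, 1))  # 7.2 cm farther than the center
```

A focus voltage tuned for the center is therefore noticeably off by the time the beam reaches a flat screen's corners, which is why sets add dynamic focus correction and why curved tubes have an easier time.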
There is no way that I am opening it up. It doesn't bother me that much.
I also asked some guys at a repair shop which specializes in old electronics to recalibrate my monitor but they wouldn't do it for some reason.
>Been passing up an old PC CRT in my alleyway for weeks now
>finally decide to go and check it out
>"Eh, it's been saturated with water repeatedly. I guess I could at least use the cable for something"
>already been cut
Was a Dell P991.
New room setup only makes photo taking viable at night, and if the above green text didn't show it already, I'm lazy. I still have folders of photos from months ago that need going through.
When they say CRT projectors, do they really mean cathode-ray-tube projectors? Are there tubes inside that fire electrons at a phosphor screen, which then emits the photons used to project the image across thin air?
Thank you. How's the color reproduction on those rebadged trinitrons?
Holy shit. After reading a little more, it looks like CRT projectors have the added advantage of not having to share one screen between multiple phosphor colors, so each of the three tubes gets one phosphor plane to itself. I'd imagine this enables a higher range of possible resolutions with a sharper dot pitch. Is this true? Also, how's the color reproduction on them? I see that some just use a single projector tube with a color wheel to filter colors, while others have R, G or B phosphors on each of three tubes and focus the three pictures together on your display tarp/screen to make a full RGB projection.
Look at all these CRTs we packed up at work.
It seems such a waste to throw all seven of them away, oh well.
Have a nice day, gentlemen.
What's the point of having a CRT? Everything looks better on my HDTV compared to my CRT. My HDTV also has no input lag. I got it for free because my boss gave me a gift card to Best Buy, so it cost me nothing. The blue colors look a little bit darker, but I got used to it. Here it is:
Using RetroArch with a very recent build of the Desmume libretro core that lets you full screen a single DS screen and swap between the two with R3.
Display is a Dell CRT monitor receiving a 3840x480 resolution and using a shader chain with slight blur and color adjustment (https://github.com/libretro/common-shaders/blob/master/cgp/tvout-interlacing.cgp). The color intensity of the CRT is cranked up to max to make the image very bright and the scanlines bloom a bit.
16:10 is indeed the best, but only for non-retro stuff and anything else that isn't 4:3-optimized. 16:10 has more horizontal resolution than comparable 4:3 displays and more vertical resolution than 16:9 displays.
But for retro stuff (and anything else that's 4:3-optimized), 4:3 is definitely the way to go.
16:9 sucks BTW. I was pissed when I found out I couldn't get a 1920x1200 16:10 screen on a new laptop anymore; I had to settle for a 1920x1080 16:9 screen, and the reduced vertical resolution is somewhat annoying when going through source code or writing music using most chiptune trackers.
Aspect ratio is very purpose-specific. At least with CRTs that have high horizontal scan rates, you can run a good resolution of whatever aspect ratio you need on a tube of a different aspect ratio. So in the end, the only thing that really matters on a CRT is the size of the display.
Try playing a 60fps 240p game on an LCD, dude. Fucking unplayable with half the frames removed and lag out the ass.
Apparently if your vendor has custom intel hd graphics drivers, you should try those, even if they're behind. Also, try restarting the PC after setting the custom resolution.
If it's a monitor, it probably supports more vertical lines than a TV, and thus it has to scale the vertical lines to what it displays. So if the monitor's line count isn't an even multiple of the 240p source's lines, you're going to get a blur effect when it scales to use the entire display. The horizontal direction is fully analog though; it's up to the signal to define how many pixels are in a line. So when it's blurry, is it just the vertical resolution that's blurry?
In any case if you want to emulate 240p content, you should probably just pick up a normal television with RGB inputs and convert the VGA signal to RGB.
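The divisibility point can be made concrete: integer multiples of 240 map each source line to a whole number of output lines; anything else forces interpolation. A small sketch (the line counts are just illustrative examples):

```python
# Scaling 240 source lines to various display line counts: only integer
# factors avoid interpolation (and therefore vertical blur).
def vertical_scale(src_lines, dst_lines):
    factor = dst_lines / src_lines
    return factor, factor == int(factor)

for dst in (480, 960, 350):
    factor, clean = vertical_scale(240, dst)
    print(dst, round(factor, 3), "clean" if clean else "blurry")
```

480 and 960 come out as clean 2x and 4x scales, while something like 350 lines gives a fractional factor, so adjacent source lines get blended and the scanline structure smears.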
So I'm trying to run my Dell with a Pentium 4 and a Radeon HD 2400 through S-Video to my Zenith.
I'm this guy >>1796778
So it works sorta, but it's blurry and such, I was wondering what you guys recommended. Mainly what resolution should I set it to for my Zenith there?
Alright, I've done some work and I can't get the upper right corner (and the right side in general, but especially the upper right corner) to be in focus. The convergence seems alright, but it's just blurry as fuck compared to the rest. Changing things in the service menu didn't help... should I open it up and look for a magnet on the tube or maybe the focus board is fucked? It seemed like some of the focus controls pertaining to the upper right corner didn't work at all.
Can't quite make out the specific model of the Zenith. It looks like some crappy image filter was set on your phone, but then again I guess that could be your exposure setting to get the CRT to display correctly.
Anyways, what's the model?
If it's a common TV it probably follows the NTSC spec for vertical lines (or PAL, or whatever is used in other countries). You should try outputting the resolution at 240p or 480i (depending on your content) for NTSC compatibility.
I know! I was lucky I found this thing. It was at a thrift store nearby - they always have CRTs. I almost didn't see this one, but I spotted the woodgrain from across the store when I was on my way out and went back to check it out. I'm a sucker for old Zenith stuff. I've got a big 32" console in the other room that my son uses for DVDs and VHS tapes.
I'm picking up a 15" Sony PVM to go underneath there. The Samsung I have is really screwed up. I'd like to fix it, but I don't know where to start. I'll post a pic of it messing up and see what you guys say.
Fair enough, the S-Video probably isn't very good quality. I've also tried running it through an S-Video-to-coaxial box and using it that way. The colors are a little nicer through that.
So I found myself a heavily shielded and high quality (via separate coaxial components) RGBHV cable (from Japan) at a thrift store. What type of CRTs accept that signal? I know I can basically convert a VGA signal to an RGBHV signal without any loss, but what would natively take in an RGBHV signal? Trinitrons? Arcade tubes? Looking for something that can do native 240p and 480i content well with high color reproduction (I care less about physical dimensions of the screen). Bonus points if it has a higher multiple order of vertical lines and resolution support than just 240p and 480i (yet still able to render the lower resolutions without quality loss).
>I know I can basically convert a VGA signal to an RGBHV signal without any loss
There is no "conversion," it's the same signal.
>what would natively take in an RGBHV signal?
Plenty of pro monitors and some computer monitors.
Yes. In fact, one of my friends nabbed a 21" Diamondtron off the side of the road that took that connector. Not only did it do up to 1600x1200 at 85Hz, it went all the way down to 240p. The video signal was messed up from water damage, but otherwise it would have been amazing.
I'm a complete outsider to these threads, but I'm fucking curious.
I see every thread filled with people showing 100x zoom shots of their screens, what is this supposed to prove? can you really tell how good a CRT is by looking at these pictures? how? They all look the same to me.
I'm just posting at the full resolution that my camera takes photos at; completely unedited. In most of the photos I try to get the full screen in the shot, unless I'm trying to get a certain part of the screen in focus, like the SMRPG shot above and a significant portion of my Super Metroid shots.
>what is this supposed to prove? can you really tell how good a CRT is by looking at these pictures
Cameras really aren't capable of showing how nice a CRT looks in person, but it can give you a good idea.
there's no way to accurately capture how nice an image on a crt looks with a photo
but you can show how the scanlines look, so sometimes we take pictures close up just to show that, rather than trying to show the whole image (most here are well aware of how nice crts look overall)
>Diamondtron is just a technology that monitors implement
It was essentially a brand name for a line of monitors that popped up after Sony's patent on the aperture grille design ran out.
I believe some of Samsung's SyncMaster CRTs also accepted BNC for RGBHV. Not really sure where you'd be able to make use of the cable, considering the other end seems to be RCA. Maybe running from an RCA-based switch to a PVM or similar; it would actually be nice to not have the little adapters attached.
The RCA plugs are simply adapters for the coaxial cables that can be removed. Luckily all of them for one end of the BNC cable came with it regardless of finding it in a thrift store.
Do you know if the Samsung series has good quality picture at 240p/480i?
For games though: Are natural colors desired on something that was unnaturally fabricated?
Generally, I think if you're in an environment with lots of fluorescent lights, like an office, you may want a different color temperature.
I think I would prefer gaming in as dark a room as possible, so that I can experience as much contrast and as deep black levels as possible, and the colors would be more vibrant. Would the 'natural' 6500k temperature suit me in that scenario?
Color temp settings should match ambient lighting. It's best to use 6500K unless you need to match unnaturally bright (or blue) conditions like a showroom or industrial setting with lots of artificial white lighting.
Look around your gaming area. Are you playing in the dark or soft yellow light, or harsh white/blue light? You can pick a color temp to match that, so that your whites look natural.
>whites extra white
Since color temp changes white point, neither end of the spectrum should be called "extra white." Instead you could say you like your whites extra blue instead of extra red.
That would be more technically correct, but I was speaking based on what my personal opinion of white should be.
6500k just leaves everything looking so yellow, even after giving my eyes time to adjust.
Pretty much any high-end computer monitor accepts BNC.
Panasonic has some nice, small, high-quality computer monitors with BNC RGBHV in. They also accept RGB with SoG and RGBC.
A little warning to all those who want to use BNC RGBHV though, adapting it to or from VGA or other similar display interfaces is not as easy as you might think.
Pretty much all display interfaces have data besides the actual image data travel in the cable too, and that data may be required for proper operation. This data may include things like supported resolutions and aspect ratio.
>6500k just leaves everything look so yellow
The Sun is yellow, hence the 'naturalness'. If we had a Sun with a color temperature of 9300K (like young, hot stars) we would all look paler than we do, even if our whites would be 'whiter'.
>Pretty much all display interfaces have data besides the actual image data travel in the cable too, and that data may be required for proper operation. This data may include things like supported resolutions and aspect ratio.
Well, for this metadata that travels on a different channel (i.e. frequency of operation) on the same copper wire, you wouldn't have to do anything special to a VGA<->BNC adapter as long as moving from one cable type to the other preserves the whole spectrum carried by the signal. The only time interference with metadata would come into play is when a mediating device strips the signal of everything except the part of the spectrum that carries the actual image data.
>the only time...is when you have a mediating device that strips the signal of all information except...optical signal data
I am going to revise this. If the monitor outputs meta information about its capabilities to the computer via its VGA-in port, but not via a BNC-in port (because it doesn't expect a computer on the other end of a BNC connection, unlike with VGA), then you have a valid point. But computers have video cards that can output to TV-centric connections like composite and S-Video, and those work just fine. I'd guess (never having connected a computer to a television via a dumb signal port) that the computer would have difficulty assessing what type of display you had plugged in, but regardless you should be able to coax the operating system into outputting whatever video mode you want (albeit you have to be extra careful with this control, as you could set a refresh rate not compatible with the display).
I actually have limited knowledge of the metadata that travels through a VGA cable (or any other) to the monitor, but having thought a little more about it, I don't think it's possible to transmit a signal on a wire at the same time one is being received, since the data flows in only one direction. So I'm probably wrong.
But I'd guess that when the computer initially identifies a display, it has a protocol in place: it listens on its display connection until it receives a signal from a connected monitor, sends data asking the monitor for its capabilities, the monitor sends those back over the cable, and finally, once the computer has that information, it transmits an appropriate video signal.
If I was creating the interfaces between computer and display I would make sure to use an appropriate networking handshake to make sure it was okay to transmit data instead of blindly sending something forward that might harm the display.
>the computer might send data to the monitor asking for its capabilities
VGA has special pins for this, as do most other display interfaces. There's also some display interfaces which require certain pins to be shorted to indicate what kind of monitor is connected, if any. SCART for example requires specific voltage input to certain pins for detecting the proper aspect ratio.
Since special pins are used for this data, just sending RGBHV to the monitor will not include this data. This can cause, among other things, problems such as the display not turning on at all due to not knowing there's a display there.
Alternatively the image source could also expect data back, so the source may try to output wrong kind of image signal to the monitor, causing all kinds of problems, or it may not acknowledge there being a monitor in the first place.
TL;DR RGBHV BNC is just image signal. Some devices expect more, so not everything is compatible with it.
So it's a matter of using the extra pins in the D-subminiature. How does display detection work for graphics cards that can output 'dumb' video signals like Composite or S-Video?
Also, couldn't you get or make a device which could connect to the miscellaneous pins on the D-sub and send fake meta data when you want to interface with your 'dumb' BNC display?
>How does display detection work for graphics cards that can output 'dumb' video signals like Composite or S-Video?
It doesn't, most of the time. Some GPUs can detect whether a cable is connected, but that's a very rare feature, and not all that useful.
If you've ever tried to connect a computer to a TV via composite, then you know exactly how much of a pain in the ass it is.
>couldn't you get or make a device which could connect to the miscellaneous pins on the D-sub and send fake meta data when you want to interface with your 'dumb' BNC display?
EDID emulators are a thing. They're also ridiculously expensive more often than not.
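For the curious, the EDID data those emulators replay has a well-documented 128-byte base block: a fixed 8-byte header, a packed 3-letter manufacturer ID, and a checksum byte that makes the whole block sum to zero mod 256. A minimal parsing sketch (not exhaustive; field offsets are from the EDID 1.x layout, and the synthetic test block below is mine, not captured from hardware):

```python
def parse_edid(edid: bytes):
    """Parse a few fields of a 128-byte EDID 1.x base block (sketch only)."""
    assert len(edid) == 128
    if edid[0:8] != b"\x00\xff\xff\xff\xff\xff\xff\x00":
        raise ValueError("bad EDID header")
    if sum(edid) % 256 != 0:
        raise ValueError("bad checksum")
    word = (edid[8] << 8) | edid[9]  # big-endian packed manufacturer ID
    # Three 5-bit letters, 1 = 'A', packed high to low.
    mfg = "".join(chr(((word >> s) & 0x1F) + ord("A") - 1) for s in (10, 5, 0))
    return {"manufacturer": mfg, "edid_version": (edid[18], edid[19])}

# Minimal synthetic block: header + Dell's well-known ID 0x10AC,
# EDID version 1.3, zero padding, checksum fixed up in the last byte.
blk = bytearray(128)
blk[0:8] = b"\x00\xff\xff\xff\xff\xff\xff\x00"
blk[8], blk[9] = 0x10, 0xAC
blk[18], blk[19] = 1, 3
blk[127] = (-sum(blk)) % 256

print(parse_edid(bytes(blk)))  # {'manufacturer': 'DEL', 'edid_version': (1, 3)}
```

On a Linux box you can feed a real dump to this from `/sys/class/drm/*/edid`, which makes it easy to see exactly what an emulator would have to fake over the DDC pins.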
Speaking of the clever mechanisms needed to support different colored phosphors in televisions: would it be possible to produce a 4K+ B/W CRT with a solid plane of white phosphor instead of slotted, space-consuming sub-pixel phosphors?
Can the electron beam be focused tightly enough with modern technology to produce ultra-high resolutions on B/W CRTs, if given the chance? Also, a video signal needs less bandwidth for a B/W picture, which would also mean less memory and processing required for games rendering in B/W, so you could maybe get away with 4K gaming at 60+Hz for the colorblind on optimized titles?
i used to have a 4" b/w tv which i played ps1 and sometimes ps2 on, the picture was very sharp
i see no reason why you can't display whatever resolution you like onto b/w crt's, given suitable circuitry to drive it
oh yea, as for optimization
the only thing you're likely to get is using 8bit per pixel instead of 24 for the framebuffer, not a significant difference nowadays
having b/w textures would be much more significant difference, mind
Yes, that's what I'm implying. The textures may be stored as RGB textures of 32-bit or higher on the hard drive, but as the game dynamically loads resources into memory it can run a 'conversion' pass to store the textures in an 8-bit (16-bit with alpha) B/W format in main memory when colorblind mode is enabled.
CAPTCHA: the assrun
mm, obviously textures would need to be converted, so it would be a tradeoff of more cpu time (when loading in textures) for reduced video ram usage
i don't know what kind of difference it would make overall
but i'm not sure this would really be worth it, i mean, how many people are fully colorblind? most 'colorblind' people can still see color to some degree
oh yea, in some cases it won't make any difference
for example, it's common for the psp to use 8bit paletted color + 8bit alpha as it is, so 8bit greyscale would save no resources (it would only appear smoother, by not wasting potential shades of grey on alternate hues)
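The load-time conversion discussed above can be sketched in a few lines: collapse each 24-bit RGB texel to one 8-bit luma byte. The Rec. 601 weights used here are one assumed choice; any perceptual weighting would do.

```python
# Sketch of the greyscale-texture idea: collapse 24-bit RGB texels to
# 8-bit luma at texture-load time, using Rec. 601 luma weights.
def rgb_to_grey8(pixels):
    """pixels: list of (r, g, b) tuples in 0-255 -> list of 0-255 luma bytes."""
    return [round(0.299 * r + 0.587 * g + 0.114 * b) for r, g, b in pixels]

texture = [(255, 255, 255), (255, 0, 0), (0, 0, 0)]  # white, red, black
grey = rgb_to_grey8(texture)
print(grey)                                           # [255, 76, 0]
print(f"VRAM: {len(texture) * 3} bytes -> {len(grey)} bytes")  # a third the footprint
```

As the thread notes, the win is the one-third VRAM footprint, bought with a bit of CPU time whenever a texture is loaded.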
>$45,000 for an HD reference CRT
Yes, but can it run Crysis?
been tweaking some modelines so i have a small and even amount of overscan on various resolutions
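For anyone else tweaking modelines: the pixel clock is just total pixels per line times total lines times refresh rate, so you can generate them programmatically. A sketch, using the classic 640x480 VGA blanking numbers as an assumed example (real hardware uses 25.175 MHz at 59.94 Hz; the round numbers below give 25.20):

```python
# Sketch: a modeline's pixel clock is htotal * vtotal * refresh.
# Timing values in the example call are the classic 640x480@60 VGA numbers.
def modeline(label, hdisp, hsync_s, hsync_e, htotal,
             vdisp, vsync_s, vsync_e, vtotal, hz):
    clock_mhz = htotal * vtotal * hz / 1e6
    return (f'Modeline "{label}" {clock_mhz:.2f} '
            f"{hdisp} {hsync_s} {hsync_e} {htotal} "
            f"{vdisp} {vsync_s} {vsync_e} {vtotal} -hsync -vsync")

print(modeline("640x480", 640, 656, 752, 800, 480, 490, 492, 525, 60))
```

Shrinking the sync/porch values relative to the display area is what moves the picture into overscan; the clock has to be recomputed each time you touch the totals.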
i bumped the scart cable to my tv just now, and the sync pin seems to be just barely connecting, even the slightest movement to the desk or even the tv causes it to lose sync for a moment
i thought it was kinda funny, so i took a video of it
Thing is fucking huge. 24" LCD for scale.
>No /vr/ yet because I spent all last night trying to get the right side in focus and haven't bothered setting up 240p or any of that stuff
Well if you're a faggot then that's okay I guess. There used to be this little faggot that was a cashier at Big Bear. 50s, crazy tan, bleached blonde hair, tons of jewelry and the most tastefully manicured fingernails I'd ever seen. Not some adolescent dumbfuck in a dress trying to hide his jawline behind an auburn wig making everyone uncomfortable exercising "muh rights" in the women's rest room but a fully realized, mature, confident faggot. That's how I like my queers.
It's an HDCRT set up for progressive-scan digital inputs, so in all likelihood analog interlaced SD signals will be upconverted and suffer the same drawbacks as on other HDTVs, minus motion blur. The only CRTs you can rely on to do HD and SD well are pro monitors that advertise a number of "lines" equivalent to double the progressive-scan resolution you're after. Usually something like "1440 lines", which will let it do 720p and 1080i if it supports a high enough frequency. I've never seen one that has enough "lines" to do 1080p and also syncs down low enough to do 240p at 15kHz, but such a wizard probably does exist somewhere.
Game Boys have a hard, digital resolution, since they were built around an integrated display with a matching fixed resolution. When you use a Game Boy Player or even emulate, you're interpolating. For full-screen Game Boy you'd get the best results from a fixed-resolution display that is as close as possible to a multiple of the Game Boy's native resolution. Although KyaDash posts pictures of GBC/GBA via Wii on his PVM and it looks more or less like S/NES graphics.