How do I make one thousand dollars per month making low poly shit on the internet?
It still takes way too long (you will earn less per hour than working at Burger King) and porn customers are the worst clients you can think of.
If you want quick money, do stream commissions: stream your session and shit out sketches every 10 minutes for $15 to $20 a pop.
Say I wanted to do ghetto diy quixel megascans. What kind of equipment would I need?
For a start, let's consider what you're trying to emulate.
Quixel has numerous custom-built scanners around the world that are each tuned to capturing different types of objects, but by and large it's simply a very accurate, automated method of photogrammetry.
What they do is basically construct a large box containing all possible lighting, filtering and capture equipment. It can isolate a roughly 2x2 meter area from any kind of outside light source (they even shoot at night just to make sure), and it automatically captures and processes all the necessary images on the spot.
The only real thing that separates their results from doing it the plain way is the high degree of standardization and color accuracy across all assets, and that they can do it on a dirt road outside instead of in a studio.
Even if you don't care for that degree of perfection, the basics of this type of processing still apply: you need a camera with a very wide dynamic range, multiple lights you can set up from every direction, and polarizing filters - a glass screw-on filter for the camera lens and flexible gel filters for the lights. The first phase is to light the surface you want to capture from 4 to 8 different angles and take a shot of each one. The second phase is to extract the albedo and specularity by taking two additional shots with all lights on: one regular image and one with the mentioned polarizing filters fitted. Make sure to mark the positions where all the filters are "in phase" so that none of the lights produce reflections.
Polarizing filters will remove all reflections from the surface of an object, so you can produce a nice albedo map from that shot, and you can use the difference between it and the regular exposure to produce a specular map.
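If you end up rolling your own processing, that difference trick is just per-pixel subtraction. A minimal sketch (the function name is made up, and it assumes both shots are aligned, equally exposed float images in the 0-1 range):

```python
import numpy as np

def split_albedo_specular(unpolarized, cross_polarized):
    """Approximate albedo/specular separation from a polarized shot pair.

    The cross-polarized shot blocks specular reflections, so it stands in
    for the albedo; subtracting it from the regular shot leaves (roughly)
    the specular-only component.
    """
    albedo = np.clip(cross_polarized, 0.0, 1.0)
    specular = np.clip(unpolarized - cross_polarized, 0.0, 1.0)
    return albedo, specular
```

In practice you'd also want to normalize exposure between the two shots before subtracting, since the filters eat a stop or two of light.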
To actually process the images you shot, you need Substance Designer 6 or 2017, as it has nodes specifically designed for this, where you input the directionally-lit images and it figures out the rest.
Getting on to equipment: a camera with a sensor that has very high dynamic range is preferable, as photogrammetry goes to shit if you have blown highlights or crushed shadow detail. That's not a problem in a studio or with otherwise very controlled lighting, but you can't always depend on things going your way.
Luckily, such cameras today are nowhere near as expensive as they used to be; you want something equipped with a modern Sony sensor. This could be a Sony A7-series camera or a Nikon D8xx, even if you have to get it second-hand. Stay away from Canon.
For lights you generally want some sort of daylight-balanced source with a high CRI rating; a poor color rendering index may cause issues with metamerism - colors shifting hue due to poor light properties. Our eyes aren't especially sensitive to this, but camera sensors very much are. You can get away with around 2-4 lights, but of course the more you have, the less work you'll have to do moving shit around after you set them up.
Hi! Thanks for all that very interesting information. Saved it for future reference.
One thing I'm wondering is how they do all their additional maps. They offer:
Is all of this calculated from geometry, color and albedo?
>What they do is basically construct a large box containing in it all possible lighting
Interesting. This would explain why their tree scans appear to be chopped off, but I'm not sure how they would deal with something like these scans: https://megascans.se/library/packs/lava-field
It was time for me to learn electronics and optics anyway, and I'm not afraid to get my hands dirty.
>To actually process the images you shot, you need Substance Designer 6 or 2017, as it has nodes specifically designed for this, where you input the directionally-lit images and it figures out the rest.
I was more thinking about doing my own software for this, which I plan to open source together with the scans (free for non commercial use).
>you need a camera with a very wide dynamic range
I really wanted an excuse to buy one.
Can you help me to model a photorealistic sheet holder, using cycles shader?
What is the best place to get free Daz content? CGPeers doesn't have that much, but the mega packs are great. The Pirate Bay has a lot of content and good variety, and sometimes you can find very specific things on direct-download blogs.
>pay for it
previous one is saging
What do you guys use to verify models before printing? I've been using Shapeways because I can't find an alternative that works. It would be great to have something I can run on my desktop rather than online.
Why are my finger bones lifting away from the mesh? Shouldn't it be bringing the mesh with it?
Don't slicers verify that a model is solid? I think Meshmixer has a "make solid" button.
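The core of what a slicer's "solid" check verifies is simple enough to sanity-check yourself on the desktop. A minimal sketch (plain Python, `is_watertight` is a made-up helper) that flags a triangle mesh as non-printable when any edge isn't shared by exactly two faces:

```python
from collections import Counter

def is_watertight(faces):
    """Check that every edge is shared by exactly two triangles.

    `faces` is a list of (i, j, k) vertex-index triples. A mesh whose
    edges all occur exactly twice is closed (watertight); boundary or
    non-manifold edges show up with counts other than 2.
    """
    edge_counts = Counter()
    for a, b, c in faces:
        for u, v in ((a, b), (b, c), (c, a)):
            # Sort so (u, v) and (v, u) count as the same edge.
            edge_counts[tuple(sorted((u, v)))] += 1
    return all(count == 2 for count in edge_counts.values())
```

A closed tetrahedron passes; delete one face and it fails. Real slicers also check for self-intersections and flipped normals, which this sketch ignores.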
God damn it, every time I ask a question I figure it out half an hour later. Turns out the vertices had associations with other vertex groups despite having zero weight in them, so when I tried to rotate one, it tried to blend between them. Going to Weight Paint -> Clean with a 0.01 threshold fixed them.
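For anyone else who hits this: what that Clean operation does amounts to something like the following (a plain-Python sketch, not the actual bpy API; `clean_vertex_weights` is a made-up name):

```python
def clean_vertex_weights(groups, threshold=0.01):
    """Mimic Blender's Weight Paint -> Clean.

    `groups` maps group name -> {vertex_index: weight}. Weights at or
    below `threshold` are dropped entirely, so the vertex no longer
    belongs to that group at all and stops getting blended toward
    bones it shouldn't follow.
    """
    return {
        name: {v: w for v, w in weights.items() if w > threshold}
        for name, weights in groups.items()
    }
```

The key point is that a zero weight is not the same as no membership: the vertex still counts as assigned to the group until the entry is removed.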
someone, anybody, aching, light my cigarette!
I know it's a tired topic, but for real. Hammer did it right and SabreCSG does it even better. Check this shit out:
Why should I not be able to do this in Blender? Why is there no interest in developing a plugin that works along these lines? Why am I still marking seams on simple geometry?
Actually, I'm not. I'm using Valve's Hammer for DOTA 2 to model simple shapes (currently working on a level for a very old game that requires a lot of restraint in terms of polycount and texture size). I would use SabreCSG because it's even better, but Unity sucks for working with huge unit counts and SabreCSG ends up bugging the fuck out and leaving the grid. Then I just export as OBJ (UVs are carried over) and load it into Blender for more precise control over normals. And that's about it, export right into the game.
Why is this approach not more popular?
It's crazy how backwards level design feels nowadays. Brushes just felt like a much more streamlined and designer-friendly way to build world geometry. I think that when Unreal Engine became popular, people got this idea in their head that brush-based geometry was a thing of the past, just because UE's level editor tools were so horribly unoptimized and unstable. The truth is Epic are a bunch of lazy fuckers who decided to deflect the criticism instead of actually bothering to fix their broke-ass tools.
It's sad when your million dollar game engine has worse tools than a game that came out in 1996.
Any of you know a good bullet tutorial?
Wanna improve mine: https://www.youtube.com/watch?v=x7dB2zYqO_8
Vicky 8 needs those shekels edition
Lazy pastebin update: never
Couple of Qs
1. How the flying fuck do rigidity maps work? I must be doing something wrong, 'cause I thought rigidity maps were for making rigid clothing like armor without worrying about it distorting in the chest. Good luck using hoops in your clothing too...
2. Everyone seems to make amazing textures. Are these from scans or are they painted? And either way, how do they do it?
3. Has anyone tried animating in Cycles or 3ds Max? Iray takes too long for me, so I'm hoping there are alternatives.
What does /3/ think of this whale model I made in ZBrush? It's not fully textured yet.
>baking realtime global illumination in unity
>moving static objects away from their light baked positions after light baking
Hi, dear /3/rothers
Would any of you be so kind to download this 3ds Max 2017 file and convert it into 3ds Max 2014 or older, please?
I do not have enough time to download that version and I'm in a big hurry
Please help me, guys
>Job interview in Florida
>I have one year professional experience doing commercial 3D work
>Fudged and said I knew a little Unreal Engine, Unity and C4D
>Friend who got a job basically said they didn't know After Effects animation for a job that wanted it, lied about it, then learned it in overhaul mode over a weekend before starting the job she has now worked at for a year
>In reality: I have made 2D character rigs and animations for a full game in Unity, I have watched a bunch of tutorials for C4D and I have only physically opened Unreal before
>I said that I have "dabbled" in C4D and Unreal
>Said I "dabbled with Unreal Particles"
>Position is as a 3D generalist, they said only one year experience is required
I already know particle systems in Maya. I have a follow-up interview in one week. I don't know when the position would fully start.
I know I shouldn't have lied, but I think the follow-up interview will go more in-depth on my knowledge. They don't seem to need someone heavy on programming, just making effects, animation trees for characters and so forth, which I've done in Unity.
Can I realistically cram this in one week, possibly two? I think I only need functional knowledge. I've been unemployed for a month, so I have an open schedule; what should I hit for the basics of Unreal? I'm not as worried about C4D, as it seems pretty similar to Maya and 3ds Max; I feel I could get up and running pretty fast.
The main things they seemed interested in were Unreal particles and rigging. I know a lot about rigging in Maya, enough to code my own Python script to auto-rig a character. I also know a lot about MASH and nParticles in Maya.
Yes, I know it probably wasn't the best course of action, but they can always reject me.
First off, why the fuck are you making a thread instead of cramming this shit right now, and up until the interview? Second, are we talking about vector particles, HLSL/OpenGL particles, or just Unreal's systems? If just Unreal, you should easily be able to learn the common particle knowledge within a week, especially if you're good with particles in Maya.
t. vfx support
Unreal "Cascade" system particles. On a bus right now.
The role listing says "only one year 3D modeling required", but they basically want someone who can do literally every facet of game design that isn't hard coding. Might be pretty unrealistic, but I've not worked for a true game studio before, just as a 3D generalist in commercials.
they are an expanding startup.
Muh diiiick! Even if you're not a fan of rgb gamershit you gotta admit this board is aesthetic as fuck
What is the best & fastest way to model this game asset? I have a couple ideas in mind but I would like to hear from more experienced users.