
Say I have a character model where part of the face has two UV maps: one is mapped to the generic body texture, and the other covers a small area of the face to allow for a highly detailed tattoo.

I want to create a shader graph that will let me dynamically blend the tattoo over the skin texture, but I can't figure out how to process both textures so that they can be fed into a blend node.

  • What did you try, and how did the outcome differ from what you want? Commented Sep 2, 2022 at 21:22
  • If I just blend the two textures together, I get a tattoo the size of the whole body texture, whereas I want it to cover a very specific position. I'm not sure how to make it figure out that position. Commented Sep 2, 2022 at 21:50
  • Maybe your question should say and show that. Presumably you considered multiplying the UV coordinate you passed into the tattoo texture sampler? Commented Sep 2, 2022 at 22:30
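Node editors differ in naming, but the underlying operation is just a linear interpolation driven by the tattoo's alpha channel, with each texture sampled through its own UV map. Here is a minimal NumPy sketch of that per-fragment math (all function and variable names are illustrative, not any engine's API):

```python
import numpy as np

def sample(tex, uv):
    """Nearest-neighbour sample of an (H, W, C) texture at UV in [0, 1]."""
    h, w = tex.shape[:2]
    x = min(int(uv[0] * w), w - 1)
    y = min(int(uv[1] * h), h - 1)
    return tex[y, x]

def blend_tattoo(body_tex, tattoo_tex, body_uv, tattoo_uv):
    """Lerp the tattoo over the skin, using the tattoo's alpha as the factor.

    body_uv and tattoo_uv are the two UV coordinates the mesh provides
    for this fragment -- one per UV map. Sampling each texture with its
    own UVs is what keeps the tattoo confined to its small face region.
    """
    skin = sample(body_tex, body_uv)        # RGB
    tattoo = sample(tattoo_tex, tattoo_uv)  # RGBA
    alpha = tattoo[3]
    return skin * (1.0 - alpha) + tattoo[:3] * alpha

# Tiny 1x1 "textures": plain skin colour, fully opaque black tattoo.
body = np.array([[[0.8, 0.6, 0.5]]])
tattoo = np.array([[[0.0, 0.0, 0.0, 1.0]]])
print(blend_tattoo(body, tattoo, (0.5, 0.5), (0.5, 0.5)))  # -> [0. 0. 0.]
```

In a shader graph this corresponds to two texture-sample nodes, each wired to a different UV map node, feeding a mix/lerp node whose factor input is the tattoo sample's alpha. Areas of the face outside the tattoo's UV island should have zero alpha so the skin shows through unchanged.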
