Sure. So dimension is the number of numbers (say that 5 times fast haha) in your embedding and metric is how they're compared.
So for example, OpenAI has the text-embedding-3-large model which outputs dimension 3,072, or text-embedding-3-small which outputs 1,536, etc. When you embed one text they give you back an array of <n> numbers where <n> is the dimension. With OpenAI you can also request smaller embeddings by specifying the dimension you want, but they'll lose some accuracy as a result.
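The shortening idea is roughly "keep the first <n> numbers and re-normalize to unit length" (that's a sketch of the concept, not the provider's exact code — the random 3,072-number vector below is just a stand-in for a real embedding):

```python
import numpy as np

def shorten(embedding: np.ndarray, dim: int) -> np.ndarray:
    """Truncate an embedding to `dim` numbers and re-normalize to unit length.
    (A sketch of the idea, not the provider's exact implementation.)"""
    cut = embedding[:dim]
    return cut / np.linalg.norm(cut)

# Stand-in for a real 3,072-dim embedding
full = np.random.default_rng(1).normal(size=3072)
full /= np.linalg.norm(full)

small = shorten(full, 256)
print(small.shape)                        # (256,)
print(np.linalg.norm(small))             # still unit length
```

The re-normalize step matters: if you just chop numbers off, the vector's length changes and cosine comparisons drift.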
Metric is how embeddings are compared. Generally you want cosine unless the model/provider tells you otherwise (OpenAI recommends cosine for their models). Euclidean also works and will give identical rankings to cosine (if the embeddings are normalized to unit length, which they often are), but cosine is a bit faster and has some other benefits.