We present a series of results on conceptually simple algorithms for bipartite matching in various online and related models. We first consider a deterministic adversarial model. The best approximation ratio achievable by a one-pass deterministic online algorithm is $1/2$, and it is achieved by any greedy algorithm. Dürr et al. recently presented a $2$-pass algorithm called Category-Advice that achieves approximation ratio $3/5$. We extend their algorithm to multiple passes. We prove the exact approximation ratio of the $k$-pass Category-Advice algorithm for all $k \ge 1$, and show that the approximation ratio converges to the inverse of the golden ratio $2/(1+\sqrt{5}) \approx 0.618$ as $k$ goes to infinity. The convergence is extremely fast: the $5$-pass Category-Advice algorithm is already within $0.01\%$ of the inverse of the golden ratio. We then consider a natural greedy algorithm, MinDegree, in the online stochastic IID model. This algorithm is an online version of the well-known and extensively studied offline algorithm MinGreedy. We show that MinDegree cannot achieve an approximation ratio better than $1-1/e$, which is guaranteed by any consistent greedy algorithm in the known IID model. Finally, following the work of Besser and Poloczek, we depart from adversarial and stochastic orderings and investigate a natural randomized algorithm, MinRanking, in the priority model. Although the priority model allows the algorithm to choose the input ordering in a general but well-defined way, this natural algorithm cannot achieve the approximation ratio of the Ranking algorithm in the ROM model.
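To make the convergence claim concrete, note that the stated values ($1/2$ for one pass, $3/5$ for two passes, and the golden-ratio limit) are consistent with a ratio of the form $F_{2k}/F_{2k+1}$ for the $k$-pass algorithm, where $F_i$ denotes the $i$-th Fibonacci number; this closed form is inferred here from those stated values as an illustration, not quoted from the abstract. Under that assumption, the first few ratios would be
\[
\frac{F_2}{F_3}=\frac{1}{2},\qquad
\frac{F_4}{F_5}=\frac{3}{5},\qquad
\frac{F_6}{F_7}=\frac{8}{13}\approx 0.6154,\qquad
\frac{F_8}{F_9}=\frac{21}{34}\approx 0.6176,\qquad
\frac{F_{10}}{F_{11}}=\frac{55}{89}\approx 0.6180,
\]
with $F_{2k}/F_{2k+1} \to 2/(1+\sqrt{5}) \approx 0.6180$ as $k \to \infty$; the $k=5$ value already differs from this limit by less than $0.01\%$, matching the claim above.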